Test Report: Docker_Linux_crio_arm64 22186

commit 5e28b85a1d78221970a3d6d4a20cdd5c3710ee83 · 2025-12-18 · 42830

Failed tests (44/316)

Order  Failed test  Duration (s)
38 TestAddons/serial/Volcano 0.32
44 TestAddons/parallel/Registry 17.09
45 TestAddons/parallel/RegistryCreds 0.5
46 TestAddons/parallel/Ingress 143.96
47 TestAddons/parallel/InspektorGadget 6.29
48 TestAddons/parallel/MetricsServer 5.41
50 TestAddons/parallel/CSI 50.25
51 TestAddons/parallel/Headlamp 3.4
52 TestAddons/parallel/CloudSpanner 5.46
53 TestAddons/parallel/LocalPath 8.62
54 TestAddons/parallel/NvidiaDevicePlugin 5.27
55 TestAddons/parallel/Yakd 6.28
78 TestFunctional/serial/SoftStart 464.23
80 TestFunctional/serial/KubectlGetPods 3
90 TestFunctional/serial/MinikubeKubectlCmd 3.01
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 3.17
171 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy 503
173 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart 369.03
175 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods 2.64
185 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd 2.35
186 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly 2.42
187 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig 735.07
188 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth 2.28
191 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService 0.06
194 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd 1.73
197 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd 3.09
201 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect 2.36
203 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim 241.66
213 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels 1.41
219 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel 0.55
222 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup 0.1
223 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect 125.44
228 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp 0.06
229 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List 0.26
230 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput 0.27
231 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS 0.28
232 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format 0.25
233 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL 0.26
237 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port 2.33
293 TestJSONOutput/pause/Command 1.68
299 TestJSONOutput/unpause/Command 2.1
358 TestKubernetesUpgrade 796.4
384 TestPause/serial/Pause 6.93
481 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 7200.079
TestAddons/serial/Volcano (0.32s)

=== RUN   TestAddons/serial/Volcano
addons_test.go:852: skipping: crio not supported
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable volcano --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable volcano --alsologtostderr -v=1: exit status 11 (317.840478ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:14:31.241153 1166493 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:14:31.242010 1166493 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:31.242051 1166493 out.go:374] Setting ErrFile to fd 2...
	I1218 00:14:31.242078 1166493 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:31.242470 1166493 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:14:31.242864 1166493 mustload.go:66] Loading cluster: addons-399099
	I1218 00:14:31.243702 1166493 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:31.243733 1166493 addons.go:622] checking whether the cluster is paused
	I1218 00:14:31.243866 1166493 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:31.243882 1166493 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:14:31.245051 1166493 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:14:31.279891 1166493 ssh_runner.go:195] Run: systemctl --version
	I1218 00:14:31.279955 1166493 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:14:31.305575 1166493 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:14:31.414839 1166493 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:14:31.414985 1166493 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:14:31.444131 1166493 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:14:31.444155 1166493 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:14:31.444160 1166493 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:14:31.444164 1166493 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:14:31.444168 1166493 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:14:31.444171 1166493 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:14:31.444174 1166493 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:14:31.444197 1166493 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:14:31.444201 1166493 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:14:31.444207 1166493 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:14:31.444213 1166493 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:14:31.444217 1166493 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:14:31.444258 1166493 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:14:31.444261 1166493 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:14:31.444265 1166493 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:14:31.444269 1166493 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:14:31.444273 1166493 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:14:31.444288 1166493 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:14:31.444291 1166493 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:14:31.444294 1166493 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:14:31.444301 1166493 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:14:31.444306 1166493 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:14:31.444310 1166493 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:14:31.444313 1166493 cri.go:89] found id: ""
	I1218 00:14:31.444380 1166493 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:14:31.461119 1166493 out.go:203] 
	W1218 00:14:31.463990 1166493 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:31Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:31Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:14:31.464010 1166493 out.go:285] * 
	* 
	W1218 00:14:31.471690 1166493 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_9bd16c244da2144137a37071fb77e06a574610a0_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_9bd16c244da2144137a37071fb77e06a574610a0_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:14:31.474664 1166493 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable volcano addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable volcano --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/serial/Volcano (0.32s)
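Most of the addon failures in this run share one signature: `addons disable` aborts with `MK_ADDON_DISABLE_PAUSED` because the paused-state check shells out to `sudo runc list -f json`, which fails with `open /run/runc: no such file or directory` on this crio node (plausibly because the low-level runtime in use is not runc, so runc's state directory never exists; unverified here). A minimal triage sketch for bucketing failures by that signature — the excerpt is taken from the log above so the sketch is self-contained; in practice you would grep a saved copy of the full run:

```shell
#!/bin/sh
# Triage sketch: every failure whose stderr contains this line is a
# downstream casualty of the same runc check, not an independent regression.
SIGNATURE='open /run/runc: no such file or directory'
# A captured excerpt from the Volcano failure stands in for the full log.
excerpt='time="2025-12-18T00:14:31Z" level=error msg="open /run/runc: no such file or directory"'
printf '%s\n' "$excerpt" | grep -c "$SIGNATURE"
```

Counting matches in the saved full log (rather than this one-line excerpt) shows how many of the 44 failures collapse into this single root cause.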

TestAddons/parallel/Registry (17.09s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry
=== CONT  TestAddons/parallel/Registry
addons_test.go:384: registry stabilized in 12.880214ms
addons_test.go:386: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Running
addons_test.go:386: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.003167398s
addons_test.go:389: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Running
addons_test.go:389: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 6.00341529s
addons_test.go:394: (dbg) Run:  kubectl --context addons-399099 delete po -l run=registry-test --now
addons_test.go:399: (dbg) Run:  kubectl --context addons-399099 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:399: (dbg) Done: kubectl --context addons-399099 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.541086943s)
addons_test.go:413: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 ip
2025/12/18 00:14:57 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable registry --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable registry --alsologtostderr -v=1: exit status 11 (272.522784ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:14:57.718711 1167453 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:14:57.719507 1167453 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:57.719541 1167453 out.go:374] Setting ErrFile to fd 2...
	I1218 00:14:57.719562 1167453 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:57.719854 1167453 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:14:57.720161 1167453 mustload.go:66] Loading cluster: addons-399099
	I1218 00:14:57.720615 1167453 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:57.720655 1167453 addons.go:622] checking whether the cluster is paused
	I1218 00:14:57.720804 1167453 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:57.720834 1167453 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:14:57.721394 1167453 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:14:57.740472 1167453 ssh_runner.go:195] Run: systemctl --version
	I1218 00:14:57.740526 1167453 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:14:57.766012 1167453 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:14:57.872605 1167453 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:14:57.872697 1167453 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:14:57.903912 1167453 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:14:57.903934 1167453 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:14:57.903939 1167453 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:14:57.903944 1167453 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:14:57.903948 1167453 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:14:57.903951 1167453 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:14:57.903955 1167453 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:14:57.903958 1167453 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:14:57.903961 1167453 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:14:57.903969 1167453 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:14:57.903973 1167453 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:14:57.903976 1167453 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:14:57.903979 1167453 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:14:57.903983 1167453 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:14:57.903986 1167453 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:14:57.903995 1167453 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:14:57.903998 1167453 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:14:57.904003 1167453 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:14:57.904006 1167453 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:14:57.904009 1167453 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:14:57.904015 1167453 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:14:57.904021 1167453 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:14:57.904024 1167453 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:14:57.904031 1167453 cri.go:89] found id: ""
	I1218 00:14:57.904081 1167453 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:14:57.918894 1167453 out.go:203] 
	W1218 00:14:57.921757 1167453 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:57Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:57Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:14:57.921780 1167453 out.go:285] * 
	* 
	W1218 00:14:57.929545 1167453 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_94fa7435cdb0fda2540861b9b71556c8cae5c5f1_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_94fa7435cdb0fda2540861b9b71556c8cae5c5f1_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:14:57.932419 1167453 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable registry addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable registry --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/Registry (17.09s)

TestAddons/parallel/RegistryCreds (0.5s)

=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds
=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:325: registry-creds stabilized in 9.816044ms
addons_test.go:327: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-399099
addons_test.go:334: (dbg) Run:  kubectl --context addons-399099 -n kube-system get secret -o yaml
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable registry-creds --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable registry-creds --alsologtostderr -v=1: exit status 11 (268.698897ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:34.788788 1168532 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:34.789496 1168532 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.789512 1168532 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:34.789519 1168532 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.789784 1168532 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:34.790104 1168532 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:34.790507 1168532 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.790528 1168532 addons.go:622] checking whether the cluster is paused
	I1218 00:15:34.790638 1168532 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.790653 1168532 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:34.791157 1168532 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:34.810276 1168532 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:34.810327 1168532 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:34.828823 1168532 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:34.934371 1168532 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:34.934506 1168532 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:34.962127 1168532 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:34.962153 1168532 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:34.962157 1168532 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:34.962161 1168532 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:34.962164 1168532 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:34.962168 1168532 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:34.962171 1168532 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:34.962174 1168532 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:34.962177 1168532 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:34.962187 1168532 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:34.962190 1168532 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:34.962194 1168532 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:34.962197 1168532 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:34.962200 1168532 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:34.962204 1168532 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:34.962212 1168532 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:34.962218 1168532 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:34.962225 1168532 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:34.962228 1168532 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:34.962231 1168532 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:34.962235 1168532 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:34.962240 1168532 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:34.962243 1168532 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:34.962246 1168532 cri.go:89] found id: ""
	I1218 00:15:34.962298 1168532 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:34.977355 1168532 out.go:203] 
	W1218 00:15:34.980397 1168532 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:34.980421 1168532 out.go:285] * 
	* 
	W1218 00:15:34.988156 1168532 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_ac42ae7bb4bac5cd909a08f6506d602b3d2ccf6c_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_ac42ae7bb4bac5cd909a08f6506d602b3d2ccf6c_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:34.991221 1168532 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable registry-creds addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable registry-creds --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/RegistryCreds (0.50s)

TestAddons/parallel/Ingress (143.96s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress
=== CONT  TestAddons/parallel/Ingress
addons_test.go:211: (dbg) Run:  kubectl --context addons-399099 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:236: (dbg) Run:  kubectl --context addons-399099 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:249: (dbg) Run:  kubectl --context addons-399099 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:353: "nginx" [eeac0ce7-b316-4351-9540-0bee961fb9f1] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx" [eeac0ce7-b316-4351-9540-0bee961fb9f1] Running
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 8.003630737s
I1218 00:15:18.284298 1159552 kapi.go:150] Service nginx in namespace default found.
addons_test.go:266: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:266: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'": exit status 1 (2m11.381340294s)

** stderr ** 
	ssh: Process exited with status 28

** /stderr **
addons_test.go:282: failed to get expected response from http://127.0.0.1/ within minikube: exit status 1
addons_test.go:290: (dbg) Run:  kubectl --context addons-399099 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:295: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 ip
addons_test.go:301: (dbg) Run:  nslookup hello-john.test 192.168.49.2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestAddons/parallel/Ingress]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestAddons/parallel/Ingress]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect addons-399099
helpers_test.go:244: (dbg) docker inspect addons-399099:

-- stdout --
	[
	    {
	        "Id": "deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3",
	        "Created": "2025-12-18T00:12:45.653994198Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1160961,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:12:45.715729745Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/hostname",
	        "HostsPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/hosts",
	        "LogPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3-json.log",
	        "Name": "/addons-399099",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "addons-399099:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "addons-399099",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3",
	                "LowerDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/merged",
	                "UpperDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/diff",
	                "WorkDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "addons-399099",
	                "Source": "/var/lib/docker/volumes/addons-399099/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "addons-399099",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "addons-399099",
	                "name.minikube.sigs.k8s.io": "addons-399099",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "96bfedaa930e5289f84629700928b9067f16cc86feaf5d8687fda240201d7ae8",
	            "SandboxKey": "/var/run/docker/netns/96bfedaa930e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33910"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33911"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33914"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33912"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33913"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "addons-399099": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "ce:64:2e:10:5f:9e",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "b6eb769954854278d027b4718b55a4f007c530da20340571549bfde161f35973",
	                    "EndpointID": "c19b4d95c5ba48737dc0c8876e34e71428f125165d4d38827642582fc8c51bfd",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "addons-399099",
	                        "deedfeeb0da0"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p addons-399099 -n addons-399099
helpers_test.go:253: <<< TestAddons/parallel/Ingress FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestAddons/parallel/Ingress]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p addons-399099 logs -n 25: (1.528563193s)
helpers_test.go:261: TestAddons/parallel/Ingress logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                                                                                                                                                                   ARGS                                                                                                                                                                                                                                   │        PROFILE         │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ delete  │ -p download-docker-540812                                                                                                                                                                                                                                                                                                                                                                                                                                                │ download-docker-540812 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ --download-only -p binary-mirror-966047 --alsologtostderr --binary-mirror http://127.0.0.1:45057 --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                                                               │ binary-mirror-966047   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ -p binary-mirror-966047                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ binary-mirror-966047   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ addons  │ enable dashboard -p addons-399099                                                                                                                                                                                                                                                                                                                                                                                                                                        │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ addons  │ disable dashboard -p addons-399099                                                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ start   │ -p addons-399099 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=crio --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:14 UTC │
	│ addons  │ addons-399099 addons disable volcano --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                              │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ addons-399099 addons disable gcp-auth --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ enable headlamp -p addons-399099 --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                  │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ addons-399099 addons disable headlamp --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ ip      │ addons-399099 ip                                                                                                                                                                                                                                                                                                                                                                                                                                                         │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │ 18 Dec 25 00:14 UTC │
	│ addons  │ addons-399099 addons disable registry --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ addons-399099 addons disable metrics-server --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable inspektor-gadget --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                     │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ ssh     │ addons-399099 ssh curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'                                                                                                                                                                                                                                                                                                                                                                                                 │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable volumesnapshots --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                      │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable csi-hostpath-driver --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                  │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ configure registry-creds -f ./testdata/addons_testconfig.json -p addons-399099                                                                                                                                                                                                                                                                                                                                                                                           │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │ 18 Dec 25 00:15 UTC │
	│ addons  │ addons-399099 addons disable registry-creds --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ ssh     │ addons-399099 ssh cat /opt/local-path-provisioner/pvc-e70fd2c3-45de-49ac-a61e-78ca412545ff_default_test-pvc/file1                                                                                                                                                                                                                                                                                                                                                        │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │ 18 Dec 25 00:15 UTC │
	│ addons  │ addons-399099 addons disable storage-provisioner-rancher --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                          │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable yakd --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                 │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable nvidia-device-plugin --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                 │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:15 UTC │                     │
	│ addons  │ addons-399099 addons disable cloud-spanner --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                        │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:16 UTC │                     │
	│ ip      │ addons-399099 ip                                                                                                                                                                                                                                                                                                                                                                                                                                                         │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:12:20
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:12:20.984924 1160558 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:12:20.985091 1160558 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:20.985121 1160558 out.go:374] Setting ErrFile to fd 2...
	I1218 00:12:20.985143 1160558 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:20.985408 1160558 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:12:20.985906 1160558 out.go:368] Setting JSON to false
	I1218 00:12:20.986702 1160558 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":24889,"bootTime":1765991852,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:12:20.986800 1160558 start.go:143] virtualization:  
	I1218 00:12:20.990266 1160558 out.go:179] * [addons-399099] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:12:20.994161 1160558 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:12:20.994217 1160558 notify.go:221] Checking for updates...
	I1218 00:12:20.999981 1160558 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:12:21.011998 1160558 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:12:21.014920 1160558 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:12:21.017882 1160558 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:12:21.020774 1160558 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:12:21.023900 1160558 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:12:21.048358 1160558 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:12:21.048496 1160558 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:21.114384 1160558 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:21.100124203 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:21.114502 1160558 docker.go:319] overlay module found
	I1218 00:12:21.117714 1160558 out.go:179] * Using the docker driver based on user configuration
	I1218 00:12:21.120516 1160558 start.go:309] selected driver: docker
	I1218 00:12:21.120540 1160558 start.go:927] validating driver "docker" against <nil>
	I1218 00:12:21.120554 1160558 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:12:21.121256 1160558 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:21.173103 1160558 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:21.164325274 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:21.173256 1160558 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:12:21.173491 1160558 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:12:21.176370 1160558 out.go:179] * Using Docker driver with root privileges
	I1218 00:12:21.179089 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:12:21.179158 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:21.179170 1160558 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:12:21.179249 1160558 start.go:353] cluster config:
	{Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:21.182318 1160558 out.go:179] * Starting "addons-399099" primary control-plane node in "addons-399099" cluster
	I1218 00:12:21.185089 1160558 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:12:21.187981 1160558 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:12:21.190799 1160558 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:12:21.190978 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:21.191004 1160558 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:12:21.191012 1160558 cache.go:65] Caching tarball of preloaded images
	I1218 00:12:21.191088 1160558 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:12:21.191104 1160558 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:12:21.191471 1160558 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json ...
	I1218 00:12:21.191499 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json: {Name:mka25bf273bdc24fbc031875fcf06423ccf24563 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:21.206938 1160558 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:12:21.207083 1160558 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1218 00:12:21.207109 1160558 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory, skipping pull
	I1218 00:12:21.207114 1160558 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in cache, skipping pull
	I1218 00:12:21.207122 1160558 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 as a tarball
	I1218 00:12:21.207127 1160558 cache.go:176] Loading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 from local cache
	I1218 00:12:39.058986 1160558 cache.go:178] successfully loaded and using gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 from cached tarball
	I1218 00:12:39.059028 1160558 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:12:39.059067 1160558 start.go:360] acquireMachinesLock for addons-399099: {Name:mkf472e05bf018f075f6ec92cb001b01a2413843 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:12:39.059202 1160558 start.go:364] duration metric: took 111.406µs to acquireMachinesLock for "addons-399099"
	I1218 00:12:39.059238 1160558 start.go:93] Provisioning new machine with config: &{Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:12:39.059318 1160558 start.go:125] createHost starting for "" (driver="docker")
	I1218 00:12:39.062874 1160558 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	I1218 00:12:39.063125 1160558 start.go:159] libmachine.API.Create for "addons-399099" (driver="docker")
	I1218 00:12:39.063166 1160558 client.go:173] LocalClient.Create starting
	I1218 00:12:39.063292 1160558 main.go:143] libmachine: Creating CA: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem
	I1218 00:12:39.196682 1160558 main.go:143] libmachine: Creating client certificate: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem
	I1218 00:12:39.459233 1160558 cli_runner.go:164] Run: docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1218 00:12:39.475616 1160558 cli_runner.go:211] docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1218 00:12:39.475719 1160558 network_create.go:284] running [docker network inspect addons-399099] to gather additional debugging logs...
	I1218 00:12:39.475744 1160558 cli_runner.go:164] Run: docker network inspect addons-399099
	W1218 00:12:39.491777 1160558 cli_runner.go:211] docker network inspect addons-399099 returned with exit code 1
	I1218 00:12:39.491810 1160558 network_create.go:287] error running [docker network inspect addons-399099]: docker network inspect addons-399099: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network addons-399099 not found
	I1218 00:12:39.491824 1160558 network_create.go:289] output of [docker network inspect addons-399099]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network addons-399099 not found
	
	** /stderr **
	I1218 00:12:39.491956 1160558 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:12:39.508539 1160558 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001983900}
	I1218 00:12:39.508584 1160558 network_create.go:124] attempt to create docker network addons-399099 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1218 00:12:39.508639 1160558 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=addons-399099 addons-399099
	I1218 00:12:39.570642 1160558 network_create.go:108] docker network addons-399099 192.168.49.0/24 created
	I1218 00:12:39.570675 1160558 kic.go:121] calculated static IP "192.168.49.2" for the "addons-399099" container
	I1218 00:12:39.570747 1160558 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1218 00:12:39.585760 1160558 cli_runner.go:164] Run: docker volume create addons-399099 --label name.minikube.sigs.k8s.io=addons-399099 --label created_by.minikube.sigs.k8s.io=true
	I1218 00:12:39.603687 1160558 oci.go:103] Successfully created a docker volume addons-399099
	I1218 00:12:39.603780 1160558 cli_runner.go:164] Run: docker run --rm --name addons-399099-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --entrypoint /usr/bin/test -v addons-399099:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1218 00:12:41.586921 1160558 cli_runner.go:217] Completed: docker run --rm --name addons-399099-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --entrypoint /usr/bin/test -v addons-399099:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib: (1.983100665s)
	I1218 00:12:41.586954 1160558 oci.go:107] Successfully prepared a docker volume addons-399099
	I1218 00:12:41.587002 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:41.587017 1160558 kic.go:194] Starting extracting preloaded images to volume ...
	I1218 00:12:41.587085 1160558 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v addons-399099:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1218 00:12:45.575570 1160558 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v addons-399099:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.988446202s)
	I1218 00:12:45.575602 1160558 kic.go:203] duration metric: took 3.988582321s to extract preloaded images to volume ...
	W1218 00:12:45.575745 1160558 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1218 00:12:45.575863 1160558 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1218 00:12:45.637562 1160558 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname addons-399099 --name addons-399099 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=addons-399099 --network addons-399099 --ip 192.168.49.2 --volume addons-399099:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1218 00:12:45.928846 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Running}}
	I1218 00:12:45.954902 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:45.981614 1160558 cli_runner.go:164] Run: docker exec addons-399099 stat /var/lib/dpkg/alternatives/iptables
	I1218 00:12:46.048704 1160558 oci.go:144] the created container "addons-399099" has a running status.
	I1218 00:12:46.048736 1160558 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa...
	I1218 00:12:46.363025 1160558 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1218 00:12:46.389900 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:46.423930 1160558 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1218 00:12:46.423949 1160558 kic_runner.go:114] Args: [docker exec --privileged addons-399099 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1218 00:12:46.511600 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:46.535966 1160558 machine.go:94] provisionDockerMachine start ...
	I1218 00:12:46.536065 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:46.564128 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:46.564718 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:46.564735 1160558 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:12:46.565455 1160558 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1218 00:12:49.723694 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-399099
	
	I1218 00:12:49.723719 1160558 ubuntu.go:182] provisioning hostname "addons-399099"
	I1218 00:12:49.723784 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:49.741190 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:49.741517 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:49.741533 1160558 main.go:143] libmachine: About to run SSH command:
	sudo hostname addons-399099 && echo "addons-399099" | sudo tee /etc/hostname
	I1218 00:12:49.901181 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-399099
	
	I1218 00:12:49.901265 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:49.919175 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:49.919483 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:49.919498 1160558 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-399099' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-399099/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-399099' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:12:50.072522 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:12:50.072548 1160558 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:12:50.072581 1160558 ubuntu.go:190] setting up certificates
	I1218 00:12:50.072601 1160558 provision.go:84] configureAuth start
	I1218 00:12:50.072675 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:50.090136 1160558 provision.go:143] copyHostCerts
	I1218 00:12:50.090228 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:12:50.090368 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:12:50.090428 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:12:50.090485 1160558 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.addons-399099 san=[127.0.0.1 192.168.49.2 addons-399099 localhost minikube]
	I1218 00:12:50.250433 1160558 provision.go:177] copyRemoteCerts
	I1218 00:12:50.250499 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:12:50.250538 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.267324 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:50.371734 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1218 00:12:50.388330 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:12:50.405058 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:12:50.421976 1160558 provision.go:87] duration metric: took 349.345812ms to configureAuth
	I1218 00:12:50.422055 1160558 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:12:50.422272 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:12:50.422404 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.439398 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:50.439713 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:50.439726 1160558 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:12:50.751149 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:12:50.751174 1160558 machine.go:97] duration metric: took 4.215187549s to provisionDockerMachine
	I1218 00:12:50.751186 1160558 client.go:176] duration metric: took 11.688009231s to LocalClient.Create
	I1218 00:12:50.751199 1160558 start.go:167] duration metric: took 11.688075633s to libmachine.API.Create "addons-399099"
	I1218 00:12:50.751206 1160558 start.go:293] postStartSetup for "addons-399099" (driver="docker")
	I1218 00:12:50.751216 1160558 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:12:50.751286 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:12:50.751329 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.777333 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:50.883963 1160558 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:12:50.887177 1160558 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:12:50.887204 1160558 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:12:50.887215 1160558 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:12:50.887279 1160558 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:12:50.887307 1160558 start.go:296] duration metric: took 136.094827ms for postStartSetup
	I1218 00:12:50.887628 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:50.903943 1160558 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json ...
	I1218 00:12:50.904418 1160558 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:12:50.904478 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.920969 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.025589 1160558 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:12:51.030352 1160558 start.go:128] duration metric: took 11.971017128s to createHost
	I1218 00:12:51.030377 1160558 start.go:83] releasing machines lock for "addons-399099", held for 11.971162198s
	I1218 00:12:51.030447 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:51.048173 1160558 ssh_runner.go:195] Run: cat /version.json
	I1218 00:12:51.048277 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:51.048627 1160558 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:12:51.048700 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:51.071640 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.075457 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.179876 1160558 ssh_runner.go:195] Run: systemctl --version
	I1218 00:12:51.270909 1160558 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:12:51.305372 1160558 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:12:51.309672 1160558 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:12:51.309744 1160558 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:12:51.336895 1160558 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1218 00:12:51.336919 1160558 start.go:496] detecting cgroup driver to use...
	I1218 00:12:51.336952 1160558 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:12:51.337001 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:12:51.353834 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:12:51.366489 1160558 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:12:51.366556 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:12:51.384345 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:12:51.403492 1160558 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:12:51.538040 1160558 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:12:51.671453 1160558 docker.go:234] disabling docker service ...
	I1218 00:12:51.671517 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:12:51.694900 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:12:51.707489 1160558 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:12:51.841376 1160558 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:12:51.963812 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:12:51.976097 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:12:51.989106 1160558 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:12:51.989215 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:51.997295 1160558 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:12:51.997412 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.007851 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.017826 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.026949 1160558 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:12:52.036059 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.044833 1160558 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.059581 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.068949 1160558 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:12:52.077280 1160558 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:12:52.084832 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:12:52.205686 1160558 ssh_runner.go:195] Run: sudo systemctl restart crio
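The run of `sed` edits above rewrites cri-o's drop-in config (pause image, cgroup driver, conmon cgroup) before the `crio` restart. A minimal sketch of the same rewrite, applied to a scratch copy so it is safe to run anywhere — the file contents and path here are assumptions for illustration, not the image's real `/etc/crio/crio.conf.d/02-crio.conf`:

```shell
# Scratch copy standing in for /etc/crio/crio.conf.d/02-crio.conf (contents assumed).
conf=$(mktemp)
cat > "$conf" <<'EOF'
[crio.image]
pause_image = "registry.k8s.io/pause:3.9"
[crio.runtime]
cgroup_manager = "systemd"
conmon_cgroup = "system.slice"
EOF
# Same edits the log shows: swap the pause image, force cgroupfs,
# drop any existing conmon_cgroup, then re-add it as "pod".
sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' "$conf"
sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' "$conf"
sed -i '/conmon_cgroup = .*/d' "$conf"
sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' "$conf"
grep -A1 'cgroup_manager' "$conf"
```

The delete-then-append pair keeps the edit idempotent: rerunning it never duplicates the `conmon_cgroup` line.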
	I1218 00:12:52.367991 1160558 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:12:52.368077 1160558 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:12:52.371658 1160558 start.go:564] Will wait 60s for crictl version
	I1218 00:12:52.371723 1160558 ssh_runner.go:195] Run: which crictl
	I1218 00:12:52.375119 1160558 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:12:52.401626 1160558 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:12:52.401725 1160558 ssh_runner.go:195] Run: crio --version
	I1218 00:12:52.431262 1160558 ssh_runner.go:195] Run: crio --version
	I1218 00:12:52.462900 1160558 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:12:52.465795 1160558 cli_runner.go:164] Run: docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:12:52.482740 1160558 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:12:52.486480 1160558 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
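The `/etc/hosts` update above uses a filter-then-append pattern so the `host.minikube.internal` entry ends up present exactly once no matter how often it runs. A sketch against a temp file (the sample hosts contents are an assumption):

```shell
# Temp file standing in for /etc/hosts.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.49.1\thost.minikube.internal\n' > "$hosts"
# Strip any existing entry for the name, then append the canonical one.
{ grep -v $'\thost.minikube.internal$' "$hosts"; \
  printf '192.168.49.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
```

Writing to a temp file and copying it back (as the log's `> /tmp/h.$$; sudo cp` does) avoids truncating `/etc/hosts` while `grep` is still reading it.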
	I1218 00:12:52.496049 1160558 kubeadm.go:884] updating cluster {Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNa
mes:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketV
MnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:12:52.496178 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:52.496257 1160558 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:12:52.530233 1160558 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:12:52.530256 1160558 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:12:52.530311 1160558 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:12:52.553538 1160558 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:12:52.553560 1160558 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:12:52.553569 1160558 kubeadm.go:935] updating node { 192.168.49.2 8443 v1.34.3 crio true true} ...
	I1218 00:12:52.553700 1160558 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=addons-399099 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:12:52.553782 1160558 ssh_runner.go:195] Run: crio config
	I1218 00:12:52.624851 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:12:52.624871 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:52.624889 1160558 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:12:52.624934 1160558 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-399099 NodeName:addons-399099 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kuberne
tes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:12:52.625107 1160558 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "addons-399099"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:12:52.625184 1160558 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:12:52.632518 1160558 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:12:52.632611 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:12:52.639991 1160558 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (363 bytes)
	I1218 00:12:52.652028 1160558 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:12:52.664668 1160558 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2210 bytes)
	I1218 00:12:52.676343 1160558 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:12:52.679659 1160558 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 00:12:52.688914 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:12:52.811066 1160558 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:12:52.827386 1160558 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099 for IP: 192.168.49.2
	I1218 00:12:52.827405 1160558 certs.go:195] generating shared ca certs ...
	I1218 00:12:52.827420 1160558 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:52.827548 1160558 certs.go:241] generating "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:12:53.092915 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt ...
	I1218 00:12:53.092948 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt: {Name:mk226e9dd5b352dedeaeb4a78738225ca3d6135a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.093145 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key ...
	I1218 00:12:53.093157 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key: {Name:mk671944f0854499fc6e3ec5d6820eacd490e2cf Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.093253 1160558 certs.go:241] generating "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:12:53.255736 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt ...
	I1218 00:12:53.255768 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt: {Name:mk4ade57f1513111ab4e4ce9561fbffb032cb5ab Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.255934 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key ...
	I1218 00:12:53.255950 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key: {Name:mkd33963319741682b24d6e4e71cc086455d2530 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.256027 1160558 certs.go:257] generating profile certs ...
	I1218 00:12:53.256087 1160558 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key
	I1218 00:12:53.256105 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt with IP's: []
	I1218 00:12:53.400824 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt ...
	I1218 00:12:53.400859 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: {Name:mk352511a18cc8ba8ba10982fc22a75b8603ce38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.401023 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key ...
	I1218 00:12:53.401041 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key: {Name:mka05f935856a494c785edbeaf9a11144edf222c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.401118 1160558 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781
	I1218 00:12:53.401140 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1218 00:12:53.616403 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 ...
	I1218 00:12:53.616431 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781: {Name:mka697dee47c7b6e4349ec95a83ee44198c7f8fd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.616596 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781 ...
	I1218 00:12:53.616609 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781: {Name:mk9858a1b6fc540bef1583e9a5cf3680480d63f3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.616687 1160558 certs.go:382] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt
	I1218 00:12:53.616768 1160558 certs.go:386] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key
	I1218 00:12:53.616822 1160558 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key
	I1218 00:12:53.616844 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt with IP's: []
	I1218 00:12:53.702402 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt ...
	I1218 00:12:53.702433 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt: {Name:mk6712bb91eec42828d26747cdd3175be72765a6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.702614 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key ...
	I1218 00:12:53.702628 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key: {Name:mk9824bd470321d26bfc0f189b1ee2d620ed19f9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.702823 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:12:53.702868 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:12:53.702894 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:12:53.702935 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
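The certs.go steps above generate a shared CA and then sign per-profile certs (client, apiserver, proxy-client) against it. A rough OpenSSL equivalent of the "minikube-user" client-cert flow, in a temp dir — the subject fields are assumptions, not what minikube's crypto.go actually encodes:

```shell
dir=$(mktemp -d)
# Self-signed CA, standing in for minikubeCA (ca.crt / ca.key).
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$dir/ca.key" \
  -out "$dir/ca.crt" -days 1 -subj "/CN=minikubeCA" 2>/dev/null
# Client CSR, then sign it with the CA (subject is illustrative).
openssl req -newkey rsa:2048 -nodes -keyout "$dir/client.key" \
  -out "$dir/client.csr" -subj "/CN=minikube-user/O=system:masters" 2>/dev/null
openssl x509 -req -in "$dir/client.csr" -CA "$dir/ca.crt" -CAkey "$dir/ca.key" \
  -CAcreateserial -out "$dir/client.crt" -days 1 2>/dev/null
openssl verify -CAfile "$dir/ca.crt" "$dir/client.crt"
```

The apiserver cert differs only in carrying SAN IPs (the `[10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]` list in the log) instead of a client subject.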
	I1218 00:12:53.703493 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:12:53.722812 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:12:53.740139 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:12:53.757546 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:12:53.774411 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1218 00:12:53.791798 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:12:53.808743 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:12:53.826544 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:12:53.844509 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:12:53.861591 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:12:53.873985 1160558 ssh_runner.go:195] Run: openssl version
	I1218 00:12:53.880210 1160558 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.887475 1160558 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:12:53.894823 1160558 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.898452 1160558 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.898518 1160558 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.939206 1160558 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:12:53.946242 1160558 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
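The `b5213941.0` symlink above is OpenSSL's subject-hash lookup convention: the trust store is searched by `<subject-hash>.0` filenames, which is why the log first runs `openssl x509 -hash -noout` and then links the hash name to the PEM. A sketch in a temp dir (cert subject is an assumption):

```shell
dir=$(mktemp -d)
# Throwaway CA cert standing in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$dir/ca.key" \
  -out "$dir/minikubeCA.pem" -days 1 -subj "/CN=minikubeCA" 2>/dev/null
# Compute the subject hash and create the <hash>.0 lookup symlink,
# as the log does for /etc/ssl/certs/b5213941.0.
hash=$(openssl x509 -hash -noout -in "$dir/minikubeCA.pem")
ln -fs "$dir/minikubeCA.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
```

`ln -fs` matches the log's command: `-f` makes re-linking idempotent if the symlink already exists.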
	I1218 00:12:53.953253 1160558 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:12:53.956817 1160558 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1218 00:12:53.956868 1160558 kubeadm.go:401] StartCluster: {Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames
:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMne
tClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:53.956951 1160558 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:12:53.957011 1160558 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:12:53.982546 1160558 cri.go:89] found id: ""
	I1218 00:12:53.982615 1160558 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:12:53.990303 1160558 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:12:53.997965 1160558 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:12:53.998061 1160558 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:12:54.007130 1160558 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:12:54.007164 1160558 kubeadm.go:158] found existing configuration files:
	
	I1218 00:12:54.007236 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1218 00:12:54.016300 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:12:54.016440 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:12:54.024438 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1218 00:12:54.032378 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:12:54.032451 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:12:54.039970 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1218 00:12:54.048662 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:12:54.048756 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:12:54.056589 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1218 00:12:54.064522 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:12:54.064598 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:12:54.072168 1160558 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:12:54.114560 1160558 kubeadm.go:319] [init] Using Kubernetes version: v1.34.3
	I1218 00:12:54.114808 1160558 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:12:54.139197 1160558 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:12:54.139315 1160558 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:12:54.139380 1160558 kubeadm.go:319] OS: Linux
	I1218 00:12:54.139448 1160558 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:12:54.139519 1160558 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:12:54.139593 1160558 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:12:54.139670 1160558 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:12:54.139739 1160558 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:12:54.139811 1160558 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:12:54.139903 1160558 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:12:54.140000 1160558 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:12:54.140075 1160558 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:12:54.208462 1160558 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:12:54.208659 1160558 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:12:54.208804 1160558 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:12:54.215551 1160558 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:12:54.222075 1160558 out.go:252]   - Generating certificates and keys ...
	I1218 00:12:54.222167 1160558 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:12:54.222230 1160558 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:12:54.550604 1160558 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1218 00:12:54.889036 1160558 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1218 00:12:55.080946 1160558 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1218 00:12:55.261829 1160558 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1218 00:12:55.643453 1160558 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1218 00:12:55.643761 1160558 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [addons-399099 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:12:56.046339 1160558 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1218 00:12:56.046548 1160558 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [addons-399099 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:12:56.821447 1160558 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1218 00:12:58.046501 1160558 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1218 00:12:58.354674 1160558 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1218 00:12:58.354963 1160558 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:12:58.634419 1160558 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:12:59.164859 1160558 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:13:00.401909 1160558 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:13:00.623079 1160558 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:13:02.190430 1160558 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:13:02.191124 1160558 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:13:02.193847 1160558 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:13:02.197226 1160558 out.go:252]   - Booting up control plane ...
	I1218 00:13:02.197323 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:13:02.197405 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:13:02.197473 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:13:02.212130 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:13:02.212276 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:13:02.220546 1160558 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:13:02.220827 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:13:02.221014 1160558 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:13:02.367718 1160558 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:13:02.367845 1160558 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:13:02.871530 1160558 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 501.745896ms
	I1218 00:13:02.873845 1160558 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1218 00:13:02.874104 1160558 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.49.2:8443/livez
	I1218 00:13:02.874199 1160558 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1218 00:13:02.874277 1160558 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1218 00:13:05.937033 1160558 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 3.061615159s
	I1218 00:13:07.416710 1160558 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 4.541717198s
	I1218 00:13:08.876605 1160558 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 6.00134908s
	I1218 00:13:08.910569 1160558 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1218 00:13:08.924720 1160558 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1218 00:13:08.938970 1160558 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1218 00:13:08.939170 1160558 kubeadm.go:319] [mark-control-plane] Marking the node addons-399099 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1218 00:13:08.950640 1160558 kubeadm.go:319] [bootstrap-token] Using token: 2958n9.ll17if4da92gcu4g
	I1218 00:13:08.953625 1160558 out.go:252]   - Configuring RBAC rules ...
	I1218 00:13:08.953755 1160558 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1218 00:13:08.957992 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1218 00:13:08.967908 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1218 00:13:08.974957 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1218 00:13:08.979252 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1218 00:13:08.983385 1160558 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1218 00:13:09.284621 1160558 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1218 00:13:09.716822 1160558 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1218 00:13:10.284367 1160558 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1218 00:13:10.285505 1160558 kubeadm.go:319] 
	I1218 00:13:10.285581 1160558 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1218 00:13:10.285586 1160558 kubeadm.go:319] 
	I1218 00:13:10.285659 1160558 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1218 00:13:10.285663 1160558 kubeadm.go:319] 
	I1218 00:13:10.285687 1160558 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1218 00:13:10.285793 1160558 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1218 00:13:10.285845 1160558 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1218 00:13:10.285849 1160558 kubeadm.go:319] 
	I1218 00:13:10.285909 1160558 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1218 00:13:10.285914 1160558 kubeadm.go:319] 
	I1218 00:13:10.285984 1160558 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1218 00:13:10.285999 1160558 kubeadm.go:319] 
	I1218 00:13:10.286125 1160558 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1218 00:13:10.286224 1160558 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1218 00:13:10.286331 1160558 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1218 00:13:10.286340 1160558 kubeadm.go:319] 
	I1218 00:13:10.286440 1160558 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1218 00:13:10.286540 1160558 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1218 00:13:10.286550 1160558 kubeadm.go:319] 
	I1218 00:13:10.286657 1160558 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token 2958n9.ll17if4da92gcu4g \
	I1218 00:13:10.286819 1160558 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:18af86f6ab3e0657d733c8936184202396957856d244f2643507ab37d928e53b \
	I1218 00:13:10.286846 1160558 kubeadm.go:319] 	--control-plane 
	I1218 00:13:10.286855 1160558 kubeadm.go:319] 
	I1218 00:13:10.286966 1160558 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1218 00:13:10.286975 1160558 kubeadm.go:319] 
	I1218 00:13:10.287076 1160558 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token 2958n9.ll17if4da92gcu4g \
	I1218 00:13:10.287227 1160558 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:18af86f6ab3e0657d733c8936184202396957856d244f2643507ab37d928e53b 
	I1218 00:13:10.290707 1160558 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1218 00:13:10.291016 1160558 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:13:10.291162 1160558 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:13:10.291193 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:13:10.291217 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:13:10.296432 1160558 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1218 00:13:10.299477 1160558 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1218 00:13:10.303965 1160558 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.3/kubectl ...
	I1218 00:13:10.303987 1160558 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2620 bytes)
	I1218 00:13:10.319641 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1218 00:13:10.607349 1160558 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1218 00:13:10.607500 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:10.607583 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes addons-399099 minikube.k8s.io/updated_at=2025_12_18T00_13_10_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924 minikube.k8s.io/name=addons-399099 minikube.k8s.io/primary=true
	I1218 00:13:10.744966 1160558 ops.go:34] apiserver oom_adj: -16
	I1218 00:13:10.745096 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:11.246120 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:11.745204 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:12.245453 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:12.745201 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:13.245235 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:13.745703 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.245126 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.745440 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.866125 1160558 kubeadm.go:1114] duration metric: took 4.258692816s to wait for elevateKubeSystemPrivileges
	I1218 00:13:14.866156 1160558 kubeadm.go:403] duration metric: took 20.909291982s to StartCluster
	I1218 00:13:14.866173 1160558 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:13:14.866294 1160558 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:13:14.866715 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:13:14.866897 1160558 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:13:14.867070 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1218 00:13:14.867320 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:13:14.867348 1160558 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:true auto-pause:false cloud-spanner:true csi-hostpath-driver:true dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:true gvisor:false headlamp:false inaccel:false ingress:true ingress-dns:true inspektor-gadget:true istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:true nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:true registry-aliases:false registry-creds:true storage-provisioner:true storage-provisioner-rancher:true volcano:true volumesnapshots:true yakd:true]
	I1218 00:13:14.867432 1160558 addons.go:70] Setting yakd=true in profile "addons-399099"
	I1218 00:13:14.867446 1160558 addons.go:239] Setting addon yakd=true in "addons-399099"
	I1218 00:13:14.867467 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.867949 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.868493 1160558 addons.go:70] Setting metrics-server=true in profile "addons-399099"
	I1218 00:13:14.868510 1160558 addons.go:239] Setting addon metrics-server=true in "addons-399099"
	I1218 00:13:14.868534 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.868935 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.869175 1160558 addons.go:70] Setting nvidia-device-plugin=true in profile "addons-399099"
	I1218 00:13:14.869206 1160558 addons.go:239] Setting addon nvidia-device-plugin=true in "addons-399099"
	I1218 00:13:14.869254 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.870516 1160558 addons.go:70] Setting registry=true in profile "addons-399099"
	I1218 00:13:14.870533 1160558 addons.go:239] Setting addon registry=true in "addons-399099"
	I1218 00:13:14.870553 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.871058 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.871952 1160558 addons.go:70] Setting registry-creds=true in profile "addons-399099"
	I1218 00:13:14.871997 1160558 addons.go:239] Setting addon registry-creds=true in "addons-399099"
	I1218 00:13:14.872046 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.872611 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.873356 1160558 addons.go:70] Setting amd-gpu-device-plugin=true in profile "addons-399099"
	I1218 00:13:14.873439 1160558 addons.go:239] Setting addon amd-gpu-device-plugin=true in "addons-399099"
	I1218 00:13:14.873510 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.874249 1160558 addons.go:70] Setting cloud-spanner=true in profile "addons-399099"
	I1218 00:13:14.874267 1160558 addons.go:239] Setting addon cloud-spanner=true in "addons-399099"
	I1218 00:13:14.874296 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.874862 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.880033 1160558 addons.go:70] Setting storage-provisioner=true in profile "addons-399099"
	I1218 00:13:14.880107 1160558 addons.go:239] Setting addon storage-provisioner=true in "addons-399099"
	I1218 00:13:14.880154 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.880733 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.888655 1160558 addons.go:70] Setting storage-provisioner-rancher=true in profile "addons-399099"
	I1218 00:13:14.888682 1160558 addons_storage_classes.go:34] enableOrDisableStorageClasses storage-provisioner-rancher=true on "addons-399099"
	I1218 00:13:14.889003 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.890214 1160558 addons.go:70] Setting csi-hostpath-driver=true in profile "addons-399099"
	I1218 00:13:14.890277 1160558 addons.go:239] Setting addon csi-hostpath-driver=true in "addons-399099"
	I1218 00:13:14.890303 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.890725 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.899984 1160558 addons.go:70] Setting volcano=true in profile "addons-399099"
	I1218 00:13:14.900015 1160558 addons.go:239] Setting addon volcano=true in "addons-399099"
	I1218 00:13:14.900102 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.900744 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.902412 1160558 addons.go:70] Setting default-storageclass=true in profile "addons-399099"
	I1218 00:13:14.902437 1160558 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "addons-399099"
	I1218 00:13:14.902884 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.920361 1160558 addons.go:70] Setting volumesnapshots=true in profile "addons-399099"
	I1218 00:13:14.920392 1160558 addons.go:239] Setting addon volumesnapshots=true in "addons-399099"
	I1218 00:13:14.920426 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.920895 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.921064 1160558 addons.go:70] Setting gcp-auth=true in profile "addons-399099"
	I1218 00:13:14.921083 1160558 mustload.go:66] Loading cluster: addons-399099
	I1218 00:13:14.921258 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:13:14.921478 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.937180 1160558 addons.go:70] Setting ingress=true in profile "addons-399099"
	I1218 00:13:14.937212 1160558 addons.go:239] Setting addon ingress=true in "addons-399099"
	I1218 00:13:14.937252 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.937817 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.940660 1160558 out.go:179]   - Using image docker.io/marcnuri/yakd:0.0.6
	I1218 00:13:14.944600 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-ns.yaml
	I1218 00:13:14.944626 1160558 ssh_runner.go:362] scp yakd/yakd-ns.yaml --> /etc/kubernetes/addons/yakd-ns.yaml (171 bytes)
	I1218 00:13:14.944691 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:14.964068 1160558 addons.go:70] Setting ingress-dns=true in profile "addons-399099"
	I1218 00:13:14.964103 1160558 addons.go:239] Setting addon ingress-dns=true in "addons-399099"
	I1218 00:13:14.964159 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.964768 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.971787 1160558 out.go:179] * Verifying Kubernetes components...
	I1218 00:13:14.975756 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:13:14.975890 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.980937 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.018087 1160558 addons.go:70] Setting inspektor-gadget=true in profile "addons-399099"
	I1218 00:13:15.018129 1160558 addons.go:239] Setting addon inspektor-gadget=true in "addons-399099"
	I1218 00:13:15.018179 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.018696 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.059598 1160558 out.go:179]   - Using image registry.k8s.io/metrics-server/metrics-server:v0.8.0
	I1218 00:13:15.063266 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I1218 00:13:15.063298 1160558 ssh_runner.go:362] scp metrics-server/metrics-apiservice.yaml --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I1218 00:13:15.063367 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.113619 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.138871 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:13:15.145504 1160558 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:13:15.145572 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:13:15.145678 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.164197 1160558 addons.go:239] Setting addon storage-provisioner-rancher=true in "addons-399099"
	I1218 00:13:15.164302 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.164735 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.166589 1160558 out.go:179]   - Using image nvcr.io/nvidia/k8s-device-plugin:v0.18.1
	I1218 00:13:15.167357 1160558 out.go:179]   - Using image docker.io/registry:3.0.0
	I1218 00:13:15.183044 1160558 out.go:179]   - Using image gcr.io/cloud-spanner-emulator/emulator:1.5.45
	I1218 00:13:15.186532 1160558 addons.go:436] installing /etc/kubernetes/addons/deployment.yaml
	I1218 00:13:15.186557 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/deployment.yaml (1004 bytes)
	I1218 00:13:15.186621 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.207119 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/kube-registry-proxy:0.0.9
	I1218 00:13:15.208083 1160558 out.go:179]   - Using image docker.io/upmcenterprises/registry-creds:1.10
	I1218 00:13:15.208327 1160558 addons.go:436] installing /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1218 00:13:15.208350 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/nvidia-device-plugin.yaml (1966 bytes)
	I1218 00:13:15.208420 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.212082 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.214707 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0
	I1218 00:13:15.215886 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-creds-rc.yaml
	I1218 00:13:15.215902 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-creds-rc.yaml (3306 bytes)
	I1218 00:13:15.215958 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.246310 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-rc.yaml
	I1218 00:13:15.246330 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (860 bytes)
	I1218 00:13:15.246390 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.255949 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/snapshot-controller:v6.1.0
	I1218 00:13:15.256340 1160558 out.go:179]   - Using image docker.io/rocm/k8s-device-plugin:1.25.2.8
	I1218 00:13:15.277823 1160558 addons.go:436] installing /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1218 00:13:15.288419 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/amd-gpu-device-plugin.yaml (1868 bytes)
	I1218 00:13:15.288587 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.293997 1160558 addons.go:239] Setting addon default-storageclass=true in "addons-399099"
	I1218 00:13:15.294038 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.294538 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.310323 1160558 out.go:179]   - Using image docker.io/kicbase/minikube-ingress-dns:0.0.4
	I1218 00:13:15.315525 1160558 addons.go:436] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1218 00:13:15.315548 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2889 bytes)
	I1218 00:13:15.315627 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.323609 1160558 out.go:179]   - Using image ghcr.io/inspektor-gadget/inspektor-gadget:v0.47.0
	W1218 00:13:15.282436 1160558 out.go:285] ! Enabling 'volcano' returned an error: running callbacks: [volcano addon does not support crio]
	I1218 00:13:15.323752 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-provisioner:v3.3.0
	I1218 00:13:15.323835 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/controller:v1.14.1
	I1218 00:13:15.323990 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I1218 00:13:15.325886 1160558 ssh_runner.go:362] scp volumesnapshots/csi-hostpath-snapshotclass.yaml --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I1218 00:13:15.325957 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.347914 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-attacher:v4.0.0
	I1218 00:13:15.356213 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-external-health-monitor-controller:v0.7.0
	I1218 00:13:15.360936 1160558 addons.go:436] installing /etc/kubernetes/addons/ig-deployment.yaml
	I1218 00:13:15.360962 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-deployment.yaml (15034 bytes)
	I1218 00:13:15.361028 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.361517 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.373183 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.6.0
	I1218 00:13:15.373339 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:15.373543 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.381017 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:15.381208 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/hostpathplugin:v1.9.0
	I1218 00:13:15.384952 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/livenessprobe:v2.8.0
	I1218 00:13:15.387861 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-resizer:v1.6.0
	I1218 00:13:15.390743 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I1218 00:13:15.390762 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-attacher.yaml --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I1218 00:13:15.390956 1160558 addons.go:436] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I1218 00:13:15.390992 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (16078 bytes)
	I1218 00:13:15.391084 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.400602 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.420479 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.426494 1160558 out.go:179]   - Using image docker.io/rancher/local-path-provisioner:v0.0.22
	I1218 00:13:15.428879 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.429675 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.432962 1160558 out.go:179]   - Using image docker.io/busybox:stable
	I1218 00:13:15.435979 1160558 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1218 00:13:15.436001 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner-rancher.yaml (3113 bytes)
	I1218 00:13:15.436065 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.464979 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.501964 1160558 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:13:15.501984 1160558 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:13:15.502048 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.511905 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.512971 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.527508 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.555601 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.567823 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.568411 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.576935 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	W1218 00:13:15.577467 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.577489 1160558 retry.go:31] will retry after 136.553495ms: ssh: handshake failed: EOF
	I1218 00:13:15.591535 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	W1218 00:13:15.593042 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.593068 1160558 retry.go:31] will retry after 292.605626ms: ssh: handshake failed: EOF
	W1218 00:13:15.717522 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.717596 1160558 retry.go:31] will retry after 415.307284ms: ssh: handshake failed: EOF
	I1218 00:13:15.832749 1160558 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:13:15.833011 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1218 00:13:16.128489 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-sa.yaml
	I1218 00:13:16.128515 1160558 ssh_runner.go:362] scp yakd/yakd-sa.yaml --> /etc/kubernetes/addons/yakd-sa.yaml (247 bytes)
	I1218 00:13:16.311097 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-crb.yaml
	I1218 00:13:16.311121 1160558 ssh_runner.go:362] scp yakd/yakd-crb.yaml --> /etc/kubernetes/addons/yakd-crb.yaml (422 bytes)
	I1218 00:13:16.344708 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml
	I1218 00:13:16.375410 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I1218 00:13:16.375431 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1907 bytes)
	I1218 00:13:16.433483 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1218 00:13:16.479377 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/deployment.yaml
	I1218 00:13:16.492437 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:13:16.571673 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1218 00:13:16.577225 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-hostpath.yaml
	I1218 00:13:16.577251 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-hostpath.yaml --> /etc/kubernetes/addons/rbac-hostpath.yaml (4266 bytes)
	I1218 00:13:16.654419 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-svc.yaml
	I1218 00:13:16.654445 1160558 ssh_runner.go:362] scp yakd/yakd-svc.yaml --> /etc/kubernetes/addons/yakd-svc.yaml (412 bytes)
	I1218 00:13:16.663713 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml
	I1218 00:13:16.668168 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1218 00:13:16.683117 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1218 00:13:16.689100 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I1218 00:13:16.689125 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotclasses.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I1218 00:13:16.693522 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-svc.yaml
	I1218 00:13:16.693544 1160558 ssh_runner.go:362] scp registry/registry-svc.yaml --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I1218 00:13:16.763631 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I1218 00:13:16.763654 1160558 ssh_runner.go:362] scp metrics-server/metrics-server-rbac.yaml --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I1218 00:13:16.786734 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-dp.yaml
	I1218 00:13:16.786758 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-dp.yaml (2017 bytes)
	I1218 00:13:16.833328 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I1218 00:13:16.833353 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotcontents.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I1218 00:13:16.897988 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-proxy.yaml
	I1218 00:13:16.898011 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (947 bytes)
	I1218 00:13:17.050754 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I1218 00:13:17.050779 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-health-monitor-controller.yaml --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3038 bytes)
	I1218 00:13:17.090411 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:13:17.098957 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml
	I1218 00:13:17.141139 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 00:13:17.141167 1160558 ssh_runner.go:362] scp metrics-server/metrics-server-service.yaml --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I1218 00:13:17.164133 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I1218 00:13:17.164160 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshots.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I1218 00:13:17.209778 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I1218 00:13:17.250996 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I1218 00:13:17.289090 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I1218 00:13:17.289116 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-provisioner.yaml --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (4442 bytes)
	I1218 00:13:17.373114 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 00:13:17.415857 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I1218 00:13:17.415883 1160558 ssh_runner.go:362] scp volumesnapshots/rbac-volume-snapshot-controller.yaml --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3545 bytes)
	I1218 00:13:17.580365 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I1218 00:13:17.580394 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-resizer.yaml --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2943 bytes)
	I1218 00:13:17.666250 1160558 addons.go:436] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:17.666273 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1475 bytes)
	I1218 00:13:17.739054 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I1218 00:13:17.739080 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-snapshotter.yaml --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3149 bytes)
	I1218 00:13:17.785843 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I1218 00:13:17.785868 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2143 bytes)
	I1218 00:13:18.134700 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:18.144384 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I1218 00:13:18.144409 1160558 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-driverinfo.yaml --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1274 bytes)
	I1218 00:13:18.482258 1160558 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (2.649199057s)
	I1218 00:13:18.482289 1160558 start.go:977] {"host.minikube.internal": 192.168.49.1} host record injected into CoreDNS's ConfigMap
	I1218 00:13:18.483233 1160558 ssh_runner.go:235] Completed: sudo systemctl start kubelet: (2.650397883s)
	I1218 00:13:18.483964 1160558 node_ready.go:35] waiting up to 6m0s for node "addons-399099" to be "Ready" ...
	I1218 00:13:18.543606 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I1218 00:13:18.543667 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (8201 bytes)
	I1218 00:13:18.762791 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I1218 00:13:18.762858 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2191 bytes)
	I1218 00:13:18.986277 1160558 kapi.go:214] "coredns" deployment in "kube-system" namespace and "addons-399099" context rescaled to 1 replicas
	I1218 00:13:19.039721 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I1218 00:13:19.039743 1160558 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-storageclass.yaml --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I1218 00:13:19.056976 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	W1218 00:13:20.525538 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:20.678106 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml: (4.333286912s)
	I1218 00:13:20.678159 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml: (4.24461104s)
	I1218 00:13:20.678198 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/deployment.yaml: (4.198798563s)
	I1218 00:13:20.932203 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (4.439731351s)
	I1218 00:13:21.058080 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml: (4.486368s)
	I1218 00:13:21.058334 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml: (4.394592528s)
	I1218 00:13:21.352972 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (4.684764819s)
	I1218 00:13:21.353021 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml: (4.66987921s)
	I1218 00:13:21.353044 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (4.262611682s)
	I1218 00:13:21.420861 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (4.211015212s)
	I1218 00:13:21.420944 1160558 addons.go:495] Verifying addon registry=true in "addons-399099"
	I1218 00:13:21.420983 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml: (4.321997708s)
	I1218 00:13:21.426093 1160558 out.go:179] * To access YAKD - Kubernetes Dashboard, wait for Pod to be ready and run the following command:
	
		minikube -p addons-399099 service yakd-dashboard -n yakd-dashboard
	
	I1218 00:13:21.426210 1160558 out.go:179] * Verifying registry addon...
	I1218 00:13:21.429741 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I1218 00:13:21.442753 1160558 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=registry
	I1218 00:13:21.442772 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:21.940323 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.179905 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (4.806744294s)
	I1218 00:13:22.179977 1160558 addons.go:495] Verifying addon metrics-server=true in "addons-399099"
	I1218 00:13:22.180094 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (4.045361931s)
	W1218 00:13:22.180134 1160558 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1218 00:13:22.180163 1160558 retry.go:31] will retry after 152.467875ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1218 00:13:22.180350 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (4.929323768s)
	I1218 00:13:22.180381 1160558 addons.go:495] Verifying addon ingress=true in "addons-399099"
	I1218 00:13:22.183637 1160558 out.go:179] * Verifying ingress addon...
	I1218 00:13:22.187116 1160558 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I1218 00:13:22.194137 1160558 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I1218 00:13:22.194158 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:22.333679 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:22.438102 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.443837 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (3.386812222s)
	I1218 00:13:22.443868 1160558 addons.go:495] Verifying addon csi-hostpath-driver=true in "addons-399099"
	I1218 00:13:22.446780 1160558 out.go:179] * Verifying csi-hostpath-driver addon...
	I1218 00:13:22.450275 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I1218 00:13:22.465849 1160558 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1218 00:13:22.465871 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:22.696482 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:22.872580 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I1218 00:13:22.872686 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:22.888891 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:22.933942 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.953714 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:22.987698 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:23.028920 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I1218 00:13:23.042429 1160558 addons.go:239] Setting addon gcp-auth=true in "addons-399099"
	I1218 00:13:23.042477 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:23.042956 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:23.059867 1160558 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I1218 00:13:23.059932 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:23.077686 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:23.190629 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:23.432619 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:23.453343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:23.690783 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:23.933375 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:23.953167 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:24.191101 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:24.433617 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:24.453183 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:24.690896 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:24.933554 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:24.954039 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:25.052181 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (2.718457288s)
	I1218 00:13:25.052259 1160558 ssh_runner.go:235] Completed: cat /var/lib/minikube/google_application_credentials.json: (1.992366855s)
	I1218 00:13:25.055677 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.1.3
	I1218 00:13:25.058563 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:25.061445 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I1218 00:13:25.061474 1160558 ssh_runner.go:362] scp gcp-auth/gcp-auth-ns.yaml --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I1218 00:13:25.076075 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I1218 00:13:25.076098 1160558 ssh_runner.go:362] scp gcp-auth/gcp-auth-service.yaml --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I1218 00:13:25.090351 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1218 00:13:25.090380 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (5421 bytes)
	I1218 00:13:25.104862 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1218 00:13:25.191302 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:25.432994 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:25.454697 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:25.487792 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:25.612099 1160558 addons.go:495] Verifying addon gcp-auth=true in "addons-399099"
	I1218 00:13:25.615147 1160558 out.go:179] * Verifying gcp-auth addon...
	I1218 00:13:25.618778 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I1218 00:13:25.627384 1160558 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I1218 00:13:25.627412 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:25.730152 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:25.933175 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:25.953785 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:26.122388 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:26.190141 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:26.432758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:26.453492 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:26.621742 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:26.690987 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:26.932584 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:26.953143 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:27.122328 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:27.190037 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:27.433170 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:27.453989 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:27.622593 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:27.690423 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:27.933546 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:27.953274 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:27.986925 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:28.122107 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:28.191024 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:28.433235 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:28.453038 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:28.622130 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:28.723461 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:28.970714 1160558 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I1218 00:13:28.970739 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:29.000901 1160558 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1218 00:13:29.000930 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:29.008723 1160558 node_ready.go:49] node "addons-399099" is "Ready"
	I1218 00:13:29.008758 1160558 node_ready.go:38] duration metric: took 10.524766521s for node "addons-399099" to be "Ready" ...
	I1218 00:13:29.008772 1160558 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:13:29.008835 1160558 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:13:29.030227 1160558 api_server.go:72] duration metric: took 14.163302322s to wait for apiserver process to appear ...
	I1218 00:13:29.030257 1160558 api_server.go:88] waiting for apiserver healthz status ...
	I1218 00:13:29.030279 1160558 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8443/healthz ...
	I1218 00:13:29.073596 1160558 api_server.go:279] https://192.168.49.2:8443/healthz returned 200:
	ok
	I1218 00:13:29.079314 1160558 api_server.go:141] control plane version: v1.34.3
	I1218 00:13:29.079346 1160558 api_server.go:131] duration metric: took 49.080913ms to wait for apiserver health ...
	I1218 00:13:29.079356 1160558 system_pods.go:43] waiting for kube-system pods to appear ...
	I1218 00:13:29.108523 1160558 system_pods.go:59] 19 kube-system pods found
	I1218 00:13:29.108565 1160558 system_pods.go:61] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending
	I1218 00:13:29.108572 1160558 system_pods.go:61] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.108577 1160558 system_pods.go:61] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending
	I1218 00:13:29.108580 1160558 system_pods.go:61] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.108584 1160558 system_pods.go:61] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.108587 1160558 system_pods.go:61] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.108591 1160558 system_pods.go:61] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.108595 1160558 system_pods.go:61] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.108599 1160558 system_pods.go:61] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.108603 1160558 system_pods.go:61] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.108606 1160558 system_pods.go:61] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.108610 1160558 system_pods.go:61] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.108614 1160558 system_pods.go:61] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.108620 1160558 system_pods.go:61] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.108625 1160558 system_pods.go:61] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.108636 1160558 system_pods.go:61] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.108639 1160558 system_pods.go:61] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.108643 1160558 system_pods.go:61] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.108650 1160558 system_pods.go:61] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.108665 1160558 system_pods.go:74] duration metric: took 29.3037ms to wait for pod list to return data ...
	I1218 00:13:29.108675 1160558 default_sa.go:34] waiting for default service account to be created ...
	I1218 00:13:29.116548 1160558 default_sa.go:45] found service account: "default"
	I1218 00:13:29.116579 1160558 default_sa.go:55] duration metric: took 7.892262ms for default service account to be created ...
	I1218 00:13:29.116589 1160558 system_pods.go:116] waiting for k8s-apps to be running ...
	I1218 00:13:29.141933 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:29.142696 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.142730 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending
	I1218 00:13:29.142737 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.142747 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending
	I1218 00:13:29.142751 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.142756 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.142761 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.142766 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.142777 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.142781 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.142792 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.142797 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.142801 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.142814 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.142818 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.142821 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.142825 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.142829 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.142834 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.142848 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.142867 1160558 retry.go:31] will retry after 202.508545ms: missing components: kube-dns
	I1218 00:13:29.218820 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:29.388158 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.388199 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:29.388207 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.388215 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:29.388292 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.388308 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.388314 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.388319 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.388330 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.388337 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.388341 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.388350 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.388367 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.388379 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.388383 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.388401 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.388412 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.388417 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.388421 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.388427 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.388447 1160558 retry.go:31] will retry after 255.169438ms: missing components: kube-dns
	I1218 00:13:29.448447 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:29.473924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:29.629343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:29.652795 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.652834 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:29.652843 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:29.652851 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:29.652857 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:29.652862 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.652867 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.652871 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.652876 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.652887 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:29.652891 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.652896 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.652902 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:29.652916 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:29.652923 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:29.652934 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.652940 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:29.652944 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.652948 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.652954 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.652971 1160558 retry.go:31] will retry after 373.193664ms: missing components: kube-dns
	I1218 00:13:29.692383 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:29.934326 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.040866 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:30.040923 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:30.040936 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:30.040945 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:30.040953 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:30.040962 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:30.040968 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:30.040989 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:30.041002 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:30.041010 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:30.041020 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:30.041025 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:30.041034 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:30.041044 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:30.041050 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:30.041065 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:30.041073 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:30.042159 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:30.042242 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.042776 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.042791 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:30.042811 1160558 retry.go:31] will retry after 524.226404ms: missing components: kube-dns
	I1218 00:13:30.135743 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:30.191213 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:30.433634 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.454234 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:30.572011 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:30.572094 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:30.572121 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:30.572144 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:30.572167 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:30.572188 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:30.572208 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:30.572254 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:30.572274 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:30.572296 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:30.572315 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:30.572335 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:30.572359 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:30.572381 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:30.572403 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:30.572447 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:30.572469 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:30.572490 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.572524 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.572549 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:30.572579 1160558 retry.go:31] will retry after 553.713784ms: missing components: kube-dns
	I1218 00:13:30.622724 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:30.691185 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:30.935679 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.966001 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:31.122607 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:31.132313 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:31.132401 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:31.132426 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:31.132450 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:31.132472 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:31.132493 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:31.132516 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:31.132537 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:31.132557 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:31.132581 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:31.132602 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:31.132621 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:31.132645 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:31.132678 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:31.132705 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:31.132727 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:31.132757 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:31.132780 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.132802 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.132821 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:31.132859 1160558 retry.go:31] will retry after 742.024058ms: missing components: kube-dns
	I1218 00:13:31.190846 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:31.433452 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:31.454121 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:31.622470 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:31.690576 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:31.887511 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:31.887545 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Running
	I1218 00:13:31.887557 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:31.887565 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:31.887574 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:31.887580 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:31.887585 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:31.887595 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:31.887600 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:31.887612 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:31.887618 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:31.887628 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:31.887634 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:31.887641 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:31.887651 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:31.887659 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:31.887669 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:31.887676 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.887685 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.887690 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:31.887701 1160558 system_pods.go:126] duration metric: took 2.771105141s to wait for k8s-apps to be running ...
	I1218 00:13:31.887712 1160558 system_svc.go:44] waiting for kubelet service to be running ....
	I1218 00:13:31.887766 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:13:31.919555 1160558 system_svc.go:56] duration metric: took 31.833999ms WaitForService to wait for kubelet
	I1218 00:13:31.919588 1160558 kubeadm.go:587] duration metric: took 17.052669561s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:13:31.919607 1160558 node_conditions.go:102] verifying NodePressure condition ...
	I1218 00:13:31.937360 1160558 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1218 00:13:31.937392 1160558 node_conditions.go:123] node cpu capacity is 2
	I1218 00:13:31.937406 1160558 node_conditions.go:105] duration metric: took 17.79321ms to run NodePressure ...
	I1218 00:13:31.937420 1160558 start.go:242] waiting for startup goroutines ...
	I1218 00:13:31.937645 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:31.954156 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:32.121984 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:32.191384 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:32.433824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:32.453831 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:32.623072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:32.690627 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:32.933092 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:32.954219 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:33.122645 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:33.191332 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:33.433305 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:33.454239 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:33.622583 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:33.691034 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:33.935551 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:33.954076 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:34.124759 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:34.224147 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:34.433903 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:34.454052 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:34.623338 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:34.691404 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:34.939119 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:34.959392 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:35.123220 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:35.191743 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:35.433855 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:35.454376 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:35.622738 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:35.691374 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:35.955877 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:35.971922 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:36.122610 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:36.192124 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:36.433725 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:36.454768 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:36.629057 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:36.691635 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:36.941113 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:36.960863 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:37.121943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:37.191761 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:37.433115 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:37.454517 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:37.622863 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:37.691066 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:37.933646 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:37.954222 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:38.122453 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:38.191090 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:38.433579 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:38.453853 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:38.621953 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:38.691504 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:38.933405 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:38.953859 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:39.122163 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:39.190415 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:39.433426 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:39.453405 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:39.623005 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:39.691918 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:39.932853 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:39.960342 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:40.123410 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:40.191066 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:40.433024 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:40.455186 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:40.622836 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:40.724262 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:40.933258 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:40.954804 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:41.122173 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:41.191042 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:41.433216 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:41.453270 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:41.621956 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:41.691717 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:41.933670 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:41.953544 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:42.124000 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:42.192128 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:42.437218 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:42.538942 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:42.621974 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:42.691644 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:42.933943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:42.953895 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:43.122498 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:43.190677 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:43.433669 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:43.459638 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:43.621937 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:43.691427 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:43.933769 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:43.955449 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:44.126970 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:44.191469 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:44.433787 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:44.454140 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:44.622708 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:44.690887 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:44.944200 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:44.979249 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:45.125859 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:45.192372 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:45.434644 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:45.454606 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:45.621728 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:45.693216 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:45.933436 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:45.954080 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:46.122320 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:46.190559 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:46.434159 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:46.454668 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:46.622776 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:46.691473 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:46.934041 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:46.954760 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:47.126868 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:47.190712 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:47.433562 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:47.454320 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:47.622194 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:47.690455 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:47.934614 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:47.955085 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:48.123316 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:48.191254 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:48.433654 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:48.454051 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:48.622586 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:48.691393 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:48.933940 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:48.962096 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:49.122478 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:49.191177 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:49.433669 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:49.454015 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:49.622486 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:49.691442 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:49.933615 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:49.954083 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:50.122420 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:50.191176 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:50.433600 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:50.454019 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:50.624651 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:50.690933 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:50.933721 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:50.953925 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:51.128938 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:51.227035 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:51.433684 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:51.454726 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:51.624706 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:51.725417 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:51.934021 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:51.954674 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:52.121857 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:52.191161 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:52.433715 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:52.454893 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:52.623312 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:52.690774 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:52.934277 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:52.954367 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:53.122434 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:53.190763 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:53.433302 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:53.453756 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:53.621956 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:53.691969 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:53.933824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:53.954792 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:54.122758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:54.191389 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:54.434122 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:54.454885 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:54.622220 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:54.694574 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:54.932680 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:54.953924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:55.122433 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:55.197688 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:55.432897 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:55.454082 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:55.622943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:55.691713 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:55.933673 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:55.954183 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:56.129820 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:56.191488 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:56.433830 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:56.454625 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:56.622810 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:56.691001 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:56.932525 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:56.953693 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:57.122751 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:57.190759 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:57.433088 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:57.454103 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:57.622471 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:57.690420 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:57.933941 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:57.953807 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:58.128283 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:58.228067 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:58.433482 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:58.453640 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:58.621873 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:58.691753 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:58.933072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:58.955382 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:59.123064 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:59.191654 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:59.434231 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:59.455406 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:59.622734 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:59.691079 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:59.933877 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:59.954285 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:00.125892 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:00.194465 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:00.434836 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:00.454680 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:00.621861 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:00.690909 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:00.932957 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:00.953785 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:01.122117 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:01.191108 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:01.433882 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:01.456333 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:01.622916 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:01.690785 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:01.934247 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:01.954642 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:02.121838 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:02.223048 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:02.433121 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:02.454331 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:02.622019 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:02.691089 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:02.933029 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:02.953921 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:03.122042 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:03.191848 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:03.432824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:03.454219 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:03.622090 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:03.690414 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:03.933758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:03.954663 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:04.122076 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:04.191176 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:04.433775 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:04.454667 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:04.622173 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:04.691959 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:04.933964 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:04.954586 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:05.122553 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:05.191511 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:05.446690 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:05.482888 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:05.622258 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:05.691158 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:05.933620 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:05.954080 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:06.122268 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:06.191264 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:06.434196 1160558 kapi.go:107] duration metric: took 45.004457535s to wait for kubernetes.io/minikube-addons=registry ...
	I1218 00:14:06.454272 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:06.622444 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:06.690590 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:06.953635 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:07.122139 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:07.192116 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:07.453252 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:07.622751 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:07.692939 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:07.955649 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:08.121854 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:08.191120 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:08.455904 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:08.622146 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:08.690801 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:08.954691 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:09.121538 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:09.191563 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:09.454288 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:09.622815 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:09.691418 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:09.953509 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:10.123796 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:10.191744 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:10.454912 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:10.622318 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:10.690979 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:10.954332 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:11.124034 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:11.190862 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:11.454635 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:11.621748 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:11.692154 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:11.954393 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:12.122819 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:12.191000 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:12.454525 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:12.622890 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:12.693726 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:12.953668 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:13.121852 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:13.190867 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:13.457814 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:13.631642 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:13.691035 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:13.956059 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:14.123179 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:14.200538 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:14.463553 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:14.622924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:14.691548 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:14.955141 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:15.121698 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:15.190585 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:15.453519 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:15.622055 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:15.691030 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:15.973041 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:16.122162 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:16.190110 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:16.454411 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:16.622749 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:16.690919 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:16.954333 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:17.122889 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:17.191285 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:17.453995 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:17.621763 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:17.691056 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:17.953740 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:18.121713 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:18.191250 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:18.454480 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:18.622732 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:18.691442 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:18.953925 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:19.121777 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:19.190491 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:19.454892 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:19.628937 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:19.691388 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:19.954134 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:20.122241 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:20.191081 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:20.454514 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:20.626275 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:20.724777 1160558 kapi.go:107] duration metric: took 58.53766019s to wait for app.kubernetes.io/name=ingress-nginx ...
	I1218 00:14:20.954343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:21.122344 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:21.454301 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:21.622677 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:21.957066 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:22.122819 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:22.458344 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:22.622072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:22.954139 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:23.122475 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:23.454592 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:23.622277 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:23.954216 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:24.123266 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:24.454827 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:24.622571 1160558 kapi.go:107] duration metric: took 59.003792385s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I1218 00:14:24.627728 1160558 out.go:179] * Your GCP credentials will now be mounted into every pod created in the addons-399099 cluster.
	I1218 00:14:24.630920 1160558 out.go:179] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I1218 00:14:24.635227 1160558 out.go:179] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
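	The `gcp-auth-skip-secret` opt-out mentioned in the log lines above can be expressed in a pod manifest roughly as follows (a minimal sketch: the pod name and image are placeholders, not part of this test run; the label key itself is the one the gcp-auth addon documents):

	```yaml
	apiVersion: v1
	kind: Pod
	metadata:
	  name: no-gcp-creds-demo          # placeholder name
	  labels:
	    gcp-auth-skip-secret: "true"   # tells the gcp-auth webhook not to mount GCP credentials
	spec:
	  containers:
	  - name: app
	    image: nginx                   # placeholder image
	```

	Pods carrying this label are left untouched by the addon's mutating webhook; all other pods created after the addon is enabled get the credentials mounted, as the log notes state.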
	I1218 00:14:24.954308 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:25.455353 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:25.957807 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:26.453987 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:26.953381 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:27.454023 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:27.953510 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:28.453667 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:28.954079 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:29.453688 1160558 kapi.go:107] duration metric: took 1m7.003416868s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I1218 00:14:29.458297 1160558 out.go:179] * Enabled addons: inspektor-gadget, nvidia-device-plugin, cloud-spanner, storage-provisioner, registry-creds, storage-provisioner-rancher, ingress-dns, amd-gpu-device-plugin, default-storageclass, yakd, metrics-server, volumesnapshots, registry, ingress, gcp-auth, csi-hostpath-driver
	I1218 00:14:29.462025 1160558 addons.go:530] duration metric: took 1m14.593841548s for enable addons: enabled=[inspektor-gadget nvidia-device-plugin cloud-spanner storage-provisioner registry-creds storage-provisioner-rancher ingress-dns amd-gpu-device-plugin default-storageclass yakd metrics-server volumesnapshots registry ingress gcp-auth csi-hostpath-driver]
	I1218 00:14:29.462092 1160558 start.go:247] waiting for cluster config update ...
	I1218 00:14:29.462120 1160558 start.go:256] writing updated cluster config ...
	I1218 00:14:29.463147 1160558 ssh_runner.go:195] Run: rm -f paused
	I1218 00:14:29.467863 1160558 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 00:14:29.471347 1160558 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-clntp" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.477568 1160558 pod_ready.go:94] pod "coredns-66bc5c9577-clntp" is "Ready"
	I1218 00:14:29.477598 1160558 pod_ready.go:86] duration metric: took 6.221873ms for pod "coredns-66bc5c9577-clntp" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.479574 1160558 pod_ready.go:83] waiting for pod "etcd-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.483184 1160558 pod_ready.go:94] pod "etcd-addons-399099" is "Ready"
	I1218 00:14:29.483202 1160558 pod_ready.go:86] duration metric: took 3.605513ms for pod "etcd-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.484906 1160558 pod_ready.go:83] waiting for pod "kube-apiserver-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.488429 1160558 pod_ready.go:94] pod "kube-apiserver-addons-399099" is "Ready"
	I1218 00:14:29.488485 1160558 pod_ready.go:86] duration metric: took 3.554356ms for pod "kube-apiserver-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.490706 1160558 pod_ready.go:83] waiting for pod "kube-controller-manager-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.871876 1160558 pod_ready.go:94] pod "kube-controller-manager-addons-399099" is "Ready"
	I1218 00:14:29.871933 1160558 pod_ready.go:86] duration metric: took 381.203313ms for pod "kube-controller-manager-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.073311 1160558 pod_ready.go:83] waiting for pod "kube-proxy-7lfkl" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.472482 1160558 pod_ready.go:94] pod "kube-proxy-7lfkl" is "Ready"
	I1218 00:14:30.472510 1160558 pod_ready.go:86] duration metric: took 399.167226ms for pod "kube-proxy-7lfkl" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.673789 1160558 pod_ready.go:83] waiting for pod "kube-scheduler-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:31.072659 1160558 pod_ready.go:94] pod "kube-scheduler-addons-399099" is "Ready"
	I1218 00:14:31.072687 1160558 pod_ready.go:86] duration metric: took 398.870219ms for pod "kube-scheduler-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:31.072701 1160558 pod_ready.go:40] duration metric: took 1.604802478s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 00:14:31.137906 1160558 start.go:625] kubectl: 1.33.2, cluster: 1.34.3 (minor skew: 1)
	I1218 00:14:31.141026 1160558 out.go:179] * Done! kubectl is now configured to use "addons-399099" cluster and "default" namespace by default
	
	
	==> CRI-O <==
	Dec 18 00:17:22 addons-399099 crio[827]: time="2025-12-18T00:17:22.027872698Z" level=info msg="Removed container ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6: kube-system/registry-creds-764b6fb674-txh6b/registry-creds" id=d34819af-74eb-44a4-b4d3-55b231854ed2 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.235552217Z" level=info msg="Running pod sandbox: default/hello-world-app-5d498dc89-4qqlr/POD" id=b6e5a026-1e4c-4006-8ea4-b79935457853 name=/runtime.v1.RuntimeService/RunPodSandbox
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.235615919Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.255669281Z" level=info msg="Got pod network &{Name:hello-world-app-5d498dc89-4qqlr Namespace:default ID:27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f UID:beb0d2c7-9bbd-473d-abaa-8eed401ad976 NetNS:/var/run/netns/90a40def-ca35-4e30-9410-893b4074d151 Networks:[{Name:kindnet Ifname:eth0}] RuntimeConfig:map[kindnet:{IP: MAC: PortMappings:[] Bandwidth:<nil> IpRanges:[] CgroupPath: PodAnnotations:0x400012a598}] Aliases:map[]}"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.255854177Z" level=info msg="Adding pod default_hello-world-app-5d498dc89-4qqlr to CNI network \"kindnet\" (type=ptp)"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.269257627Z" level=info msg="Got pod network &{Name:hello-world-app-5d498dc89-4qqlr Namespace:default ID:27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f UID:beb0d2c7-9bbd-473d-abaa-8eed401ad976 NetNS:/var/run/netns/90a40def-ca35-4e30-9410-893b4074d151 Networks:[{Name:kindnet Ifname:eth0}] RuntimeConfig:map[kindnet:{IP: MAC: PortMappings:[] Bandwidth:<nil> IpRanges:[] CgroupPath: PodAnnotations:0x400012a598}] Aliases:map[]}"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.272128571Z" level=info msg="Checking pod default_hello-world-app-5d498dc89-4qqlr for CNI network kindnet (type=ptp)"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.276931463Z" level=info msg="Ran pod sandbox 27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f with infra container: default/hello-world-app-5d498dc89-4qqlr/POD" id=b6e5a026-1e4c-4006-8ea4-b79935457853 name=/runtime.v1.RuntimeService/RunPodSandbox
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.278358862Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:1.0" id=068ee669-524d-451f-9f28-58958b469c7b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.278584528Z" level=info msg="Image docker.io/kicbase/echo-server:1.0 not found" id=068ee669-524d-451f-9f28-58958b469c7b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.278692512Z" level=info msg="Neither image nor artifact docker.io/kicbase/echo-server:1.0 found" id=068ee669-524d-451f-9f28-58958b469c7b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.28063013Z" level=info msg="Pulling image: docker.io/kicbase/echo-server:1.0" id=86aeedbe-7314-46ee-b1da-8552c85e9dae name=/runtime.v1.ImageService/PullImage
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.287718714Z" level=info msg="Trying to access \"docker.io/kicbase/echo-server:1.0\""
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.891836527Z" level=info msg="Pulled image: docker.io/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b" id=86aeedbe-7314-46ee-b1da-8552c85e9dae name=/runtime.v1.ImageService/PullImage
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.892450021Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:1.0" id=f753f6a8-c7bf-4ff2-a0d7-51225b769156 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.894602663Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:1.0" id=228a4b81-b070-461c-b5f3-9d904245ebab name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.900270568Z" level=info msg="Creating container: default/hello-world-app-5d498dc89-4qqlr/hello-world-app" id=a202801d-2650-4009-810d-6463d4602bad name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.900383918Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.90876548Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.908952379Z" level=warning msg="Failed to open /etc/passwd: open /var/lib/containers/storage/overlay/6eb2b81bd8823786fb7104b3ecb4e4043b2af1ec20420f0e6c152a9105cab205/merged/etc/passwd: no such file or directory"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.908973383Z" level=warning msg="Failed to open /etc/group: open /var/lib/containers/storage/overlay/6eb2b81bd8823786fb7104b3ecb4e4043b2af1ec20420f0e6c152a9105cab205/merged/etc/group: no such file or directory"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.909204251Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.926010927Z" level=info msg="Created container a700c0e5a0471bd7d0cb61677969770a0ed9a23e4422bc0c0e0baa81caebf45c: default/hello-world-app-5d498dc89-4qqlr/hello-world-app" id=a202801d-2650-4009-810d-6463d4602bad name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.929396103Z" level=info msg="Starting container: a700c0e5a0471bd7d0cb61677969770a0ed9a23e4422bc0c0e0baa81caebf45c" id=a164cb1b-4111-453a-9171-920fd474e8e8 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 00:17:30 addons-399099 crio[827]: time="2025-12-18T00:17:30.933516827Z" level=info msg="Started container" PID=7129 containerID=a700c0e5a0471bd7d0cb61677969770a0ed9a23e4422bc0c0e0baa81caebf45c description=default/hello-world-app-5d498dc89-4qqlr/hello-world-app id=a164cb1b-4111-453a-9171-920fd474e8e8 name=/runtime.v1.RuntimeService/StartContainer sandboxID=27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                                                        CREATED                  STATE               NAME                                     ATTEMPT             POD ID              POD                                         NAMESPACE
	a700c0e5a0471       docker.io/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b                                        Less than a second ago   Running             hello-world-app                          0                   27e59bd442800       hello-world-app-5d498dc89-4qqlr             default
	b5deac7a2ade5       a2fd0654e5baeec8de2209bfade13a0034e942e708fd2bbfce69bb26a3c02e14                                                                             10 seconds ago           Exited              registry-creds                           4                   49da00b84bf96       registry-creds-764b6fb674-txh6b             kube-system
	27f16817ed7de       10afed3caf3eed1b711b8fa0a9600a7b488a45653a15a598a47ac570c1204cc4                                                                             2 minutes ago            Running             nginx                                    0                   a4dfe59afa71e       nginx                                       default
	a475cdf1ea4c5       gcr.io/k8s-minikube/busybox@sha256:580b0aa58b210f512f818b7b7ef4f63c803f7a8cd6baf571b1462b79f7b7719e                                          2 minutes ago            Running             busybox                                  0                   b459bbe53bb81       busybox                                     default
	7b50af57b1e25       registry.k8s.io/sig-storage/csi-snapshotter@sha256:bd6b8417b2a83e66ab1d4c1193bb2774f027745bdebbd9e0c1a6518afdecc39a                          3 minutes ago            Running             csi-snapshotter                          0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	1d0c5089b631a       registry.k8s.io/sig-storage/csi-provisioner@sha256:98ffd09c0784203d200e0f8c241501de31c8df79644caac7eed61bd6391e5d49                          3 minutes ago            Running             csi-provisioner                          0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	479b1b4e720d7       registry.k8s.io/sig-storage/livenessprobe@sha256:8b00c6e8f52639ed9c6f866085893ab688e57879741b3089e3cfa9998502e158                            3 minutes ago            Running             liveness-probe                           0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	af76677047f8e       registry.k8s.io/sig-storage/hostpathplugin@sha256:7b1dfc90a367222067fc468442fdf952e20fc5961f25c1ad654300ddc34d7083                           3 minutes ago            Running             hostpath                                 0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	4016d25b4ea6b       gcr.io/k8s-minikube/gcp-auth-webhook@sha256:2de98fa4b397f92e5e8e05d73caf21787a1c72c41378f3eb7bad72b1e0f4e9ff                                 3 minutes ago            Running             gcp-auth                                 0                   8047cb3ffdd9b       gcp-auth-78565c9fb4-jgkjz                   gcp-auth
	fc97664c5e67e       registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:511b8c8ac828194a753909d26555ff08bc12f497dd8daeb83fe9d593693a26c1                3 minutes ago            Running             node-driver-registrar                    0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	2ab50f9580894       registry.k8s.io/ingress-nginx/controller@sha256:75494e2145fbebf362d24e24e9285b7fbb7da8783ab272092e3126e24ee4776d                             3 minutes ago            Running             controller                               0                   24cdcae5fadc0       ingress-nginx-controller-85d4c799dd-hgt9x   ingress-nginx
	5a8d1e1ff0940       ghcr.io/inspektor-gadget/inspektor-gadget@sha256:fadc7bf59b69965b6707edb68022bed4f55a1f99b15f7acd272793e48f171496                            3 minutes ago            Running             gadget                                   0                   1f73995292cb1       gadget-rc6dl                                gadget
	0da3f638ac932       docker.io/marcnuri/yakd@sha256:0b7e831df7fe4ad1c8c56a736a8d66bd86e243f6777d3c512ead47199d8fbe1a                                              3 minutes ago            Running             yakd                                     0                   9d59b762d031d       yakd-dashboard-6654c87f9b-rbxvh             yakd-dashboard
	6820a02fa77b5       gcr.io/k8s-minikube/kube-registry-proxy@sha256:26c84a64530a67aa4d749dd4356d67ea27a2576e4d25b640d21857b0574cfd4b                              3 minutes ago            Running             registry-proxy                           0                   108a5fb8aa94d       registry-proxy-p5q9s                        kube-system
	ba89b430a4884       registry.k8s.io/sig-storage/snapshot-controller@sha256:5d668e35c15df6e87e2530da25d557f543182cedbdb39d421b87076463ee9857                      3 minutes ago            Running             volume-snapshot-controller               0                   1aa98e2aa8444       snapshot-controller-7d9fbc56b8-kzs8c        kube-system
	33876a37e66b9       nvcr.io/nvidia/k8s-device-plugin@sha256:10b7b747520ba2314061b5b319d3b2766b9cec1fd9404109c607e85b30af6905                                     3 minutes ago            Running             nvidia-device-plugin-ctr                 0                   0f78198a04813       nvidia-device-plugin-daemonset-d4dsb        kube-system
	6e71edf4ac25c       docker.io/library/registry@sha256:8715992817b2254fe61e74ffc6a4096d57a0cde36c95ea075676c05f7a94a630                                           3 minutes ago            Running             registry                                 0                   11ec0792611a6       registry-6b586f9694-k4nhf                   kube-system
	1678bbcf03188       docker.io/rancher/local-path-provisioner@sha256:689a2489a24e74426e4a4666e611c988202c5fa995908b0c60133aca3eb87d98                             3 minutes ago            Running             local-path-provisioner                   0                   4c7096b33ef9f       local-path-provisioner-648f6765c9-xht7b     local-path-storage
	efe15f55db413       registry.k8s.io/sig-storage/snapshot-controller@sha256:5d668e35c15df6e87e2530da25d557f543182cedbdb39d421b87076463ee9857                      3 minutes ago            Running             volume-snapshot-controller               0                   561754ea61f80       snapshot-controller-7d9fbc56b8-knbsf        kube-system
	0f267b721e3e3       registry.k8s.io/sig-storage/csi-external-health-monitor-controller@sha256:8b9df00898ded1bfb4d8f3672679f29cd9f88e651b76fef64121c8d347dd12c0   3 minutes ago            Running             csi-external-health-monitor-controller   0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	6a785405de376       docker.io/kicbase/minikube-ingress-dns@sha256:6d710af680d8a9b5a5b1f9047eb83ee4c9258efd3fcd962f938c00bcbb4c5958                               3 minutes ago            Running             minikube-ingress-dns                     0                   c061c9262ab31       kube-ingress-dns-minikube                   kube-system
	715597a141e6d       e8105550077f5c6c8e92536651451107053f0e41635396ee42aef596441c179a                                                                             3 minutes ago            Exited              patch                                    2                   249592722b5c1       ingress-nginx-admission-patch-b9f7w         ingress-nginx
	ec3f883321902       registry.k8s.io/sig-storage/csi-attacher@sha256:4b5609c78455de45821910065281a368d5f760b41250f90cbde5110543bdc326                             3 minutes ago            Running             csi-attacher                             0                   8e9004760be49       csi-hostpath-attacher-0                     kube-system
	1ebed049ac583       registry.k8s.io/ingress-nginx/kube-webhook-certgen@sha256:c9c1ef89e4bb9d6c9c6c0b5375c3253a0b951e5b731240be20cebe5593de142d                   3 minutes ago            Exited              create                                   0                   5fa335e974b4f       ingress-nginx-admission-create-8kc27        ingress-nginx
	fef85e094a52c       registry.k8s.io/sig-storage/csi-resizer@sha256:82c1945463342884c05a5b2bc31319712ce75b154c279c2a10765f61e0f688af                              3 minutes ago            Running             csi-resizer                              0                   22bf487459011       csi-hostpath-resizer-0                      kube-system
	ad634b2fc34b9       gcr.io/cloud-spanner-emulator/emulator@sha256:daeab9cb1978e02113045625e2633619f465f22aac7638101995f4cd03607170                               3 minutes ago            Running             cloud-spanner-emulator                   0                   8137998debf9b       cloud-spanner-emulator-5bdddb765-vw8bw      default
	c220c6e5aa941       registry.k8s.io/metrics-server/metrics-server@sha256:8f49cf1b0688bb0eae18437882dbf6de2c7a2baac71b1492bc4eca25439a1bf2                        3 minutes ago            Running             metrics-server                           0                   58ef667b2f113       metrics-server-85b7d694d7-b7rjb             kube-system
	6b35840df9ffc       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                                                             4 minutes ago            Running             coredns                                  0                   cbdbd6ad2d4d8       coredns-66bc5c9577-clntp                    kube-system
	63ec289de4c73       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6                                                                             4 minutes ago            Running             storage-provisioner                      0                   1fd9516c640fa       storage-provisioner                         kube-system
	59f8baffb7c55       docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3                                           4 minutes ago            Running             kindnet-cni                              0                   ed25cae18d206       kindnet-gxdvh                               kube-system
	4f3b59d54925a       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                                                             4 minutes ago            Running             kube-proxy                               0                   9dff0d23f092e       kube-proxy-7lfkl                            kube-system
	b53c70f983ca4       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                                                             4 minutes ago            Running             etcd                                     0                   ebe8cef45c851       etcd-addons-399099                          kube-system
	6cf23e9a9796c       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                                                             4 minutes ago            Running             kube-controller-manager                  0                   01dd6d80f137f       kube-controller-manager-addons-399099       kube-system
	c3a1af0798174       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                                                             4 minutes ago            Running             kube-apiserver                           0                   0a1691408e280       kube-apiserver-addons-399099                kube-system
	067ca66f2fd9c       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                                                             4 minutes ago            Running             kube-scheduler                           0                   e7e713568af7b       kube-scheduler-addons-399099                kube-system
	
	
	==> coredns [6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c] <==
	[INFO] 10.244.0.17:44013 - 48017 "AAAA IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 94 false 1232" NXDOMAIN qr,rd,ra 83 0.001837262s
	[INFO] 10.244.0.17:44013 - 3621 "AAAA IN registry.kube-system.svc.cluster.local. udp 67 false 1232" NOERROR qr,aa,rd 149 0.000542734s
	[INFO] 10.244.0.17:44013 - 41331 "A IN registry.kube-system.svc.cluster.local. udp 67 false 1232" NOERROR qr,aa,rd 110 0.000548863s
	[INFO] 10.244.0.17:39118 - 5435 "AAAA IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000155908s
	[INFO] 10.244.0.17:39118 - 5221 "A IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000272368s
	[INFO] 10.244.0.17:40435 - 43143 "AAAA IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000111659s
	[INFO] 10.244.0.17:40435 - 42938 "A IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.00021043s
	[INFO] 10.244.0.17:51322 - 56623 "AAAA IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000133115s
	[INFO] 10.244.0.17:51322 - 56451 "A IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000153053s
	[INFO] 10.244.0.17:49830 - 2502 "A IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 83 false 512" NXDOMAIN qr,rd,ra 83 0.001223072s
	[INFO] 10.244.0.17:49830 - 2715 "AAAA IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 83 false 512" NXDOMAIN qr,rd,ra 83 0.00130288s
	[INFO] 10.244.0.17:56035 - 13130 "AAAA IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 149 0.000121735s
	[INFO] 10.244.0.17:56035 - 12949 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000070365s
	[INFO] 10.244.0.21:59226 - 37541 "AAAA IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.000175181s
	[INFO] 10.244.0.21:45830 - 16842 "A IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.000154333s
	[INFO] 10.244.0.21:34631 - 65485 "A IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.000248574s
	[INFO] 10.244.0.21:51657 - 49360 "AAAA IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.00030842s
	[INFO] 10.244.0.21:53843 - 17208 "A IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000160569s
	[INFO] 10.244.0.21:60524 - 36001 "AAAA IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000061003s
	[INFO] 10.244.0.21:34029 - 55791 "AAAA IN storage.googleapis.com.us-east-2.compute.internal. udp 78 false 1232" NXDOMAIN qr,rd,ra 67 0.002212027s
	[INFO] 10.244.0.21:43457 - 26220 "A IN storage.googleapis.com.us-east-2.compute.internal. udp 78 false 1232" NXDOMAIN qr,rd,ra 67 0.002967258s
	[INFO] 10.244.0.21:52512 - 5671 "A IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 648 0.002945072s
	[INFO] 10.244.0.21:45038 - 9431 "AAAA IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 240 0.003147362s
	[INFO] 10.244.0.23:45594 - 2 "AAAA IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 149 0.000241666s
	[INFO] 10.244.0.23:59041 - 3 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000158115s
	
	
	==> describe nodes <==
	Name:               addons-399099
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=addons-399099
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924
	                    minikube.k8s.io/name=addons-399099
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_12_18T00_13_10_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-399099
	Annotations:        csi.volume.kubernetes.io/nodeid: {"hostpath.csi.k8s.io":"addons-399099"}
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 18 Dec 2025 00:13:07 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-399099
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 18 Dec 2025 00:17:26 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 18 Dec 2025 00:16:14 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 18 Dec 2025 00:16:14 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 18 Dec 2025 00:16:14 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 18 Dec 2025 00:16:14 +0000   Thu, 18 Dec 2025 00:13:28 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    addons-399099
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	System Info:
	  Machine ID:                 02ff784b806e34735a6e229a69428228
	  System UUID:                001279d3-ffb2-40f8-9d24-edba87ec0224
	  Boot ID:                    57207cc2-434a-4297-a7b8-47b6fa2e7487
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  cri-o://1.34.3
	  Kubelet Version:            v1.34.3
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (28 in total)
	  Namespace                   Name                                         CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                         ------------  ----------  ---------------  -------------  ---
	  default                     busybox                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m
	  default                     cloud-spanner-emulator-5bdddb765-vw8bw       0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m12s
	  default                     hello-world-app-5d498dc89-4qqlr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         2s
	  default                     nginx                                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m21s
	  gadget                      gadget-rc6dl                                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m11s
	  gcp-auth                    gcp-auth-78565c9fb4-jgkjz                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m6s
	  ingress-nginx               ingress-nginx-controller-85d4c799dd-hgt9x    100m (5%)     0 (0%)      90Mi (1%)        0 (0%)         4m9s
	  kube-system                 coredns-66bc5c9577-clntp                     100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     4m16s
	  kube-system                 csi-hostpath-attacher-0                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m9s
	  kube-system                 csi-hostpath-resizer-0                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m9s
	  kube-system                 csi-hostpathplugin-5v2nz                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m3s
	  kube-system                 etcd-addons-399099                           100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         4m24s
	  kube-system                 kindnet-gxdvh                                100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      4m17s
	  kube-system                 kube-apiserver-addons-399099                 250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 kube-controller-manager-addons-399099        200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 kube-ingress-dns-minikube                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m10s
	  kube-system                 kube-proxy-7lfkl                             0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m17s
	  kube-system                 kube-scheduler-addons-399099                 100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 metrics-server-85b7d694d7-b7rjb              100m (5%)     0 (0%)      200Mi (2%)       0 (0%)         4m10s
	  kube-system                 nvidia-device-plugin-daemonset-d4dsb         0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m2s
	  kube-system                 registry-6b586f9694-k4nhf                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m11s
	  kube-system                 registry-creds-764b6fb674-txh6b              0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m13s
	  kube-system                 registry-proxy-p5q9s                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m3s
	  kube-system                 snapshot-controller-7d9fbc56b8-knbsf         0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m10s
	  kube-system                 snapshot-controller-7d9fbc56b8-kzs8c         0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m10s
	  kube-system                 storage-provisioner                          0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m11s
	  local-path-storage          local-path-provisioner-648f6765c9-xht7b      0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m11s
	  yakd-dashboard              yakd-dashboard-6654c87f9b-rbxvh              0 (0%)        0 (0%)      128Mi (1%)       256Mi (3%)     4m10s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                1050m (52%)  100m (5%)
	  memory             638Mi (8%)   476Mi (6%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-1Gi      0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	  hugepages-32Mi     0 (0%)       0 (0%)
	  hugepages-64Ki     0 (0%)       0 (0%)
	Events:
	  Type     Reason                   Age    From             Message
	  ----     ------                   ----   ----             -------
	  Normal   Starting                 4m15s  kube-proxy       
	  Normal   Starting                 4m22s  kubelet          Starting kubelet.
	  Warning  CgroupV1                 4m22s  kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  4m22s  kubelet          Node addons-399099 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    4m22s  kubelet          Node addons-399099 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     4m22s  kubelet          Node addons-399099 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           4m17s  node-controller  Node addons-399099 event: Registered Node addons-399099 in Controller
	  Normal   NodeReady                4m3s   kubelet          Node addons-399099 status is now: NodeReady
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762] <==
	{"level":"warn","ts":"2025-12-18T00:13:05.595814Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56198","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.620084Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56220","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.631231Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56238","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.657239Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56254","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.688764Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56280","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.704881Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56300","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.740979Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56310","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.766452Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56336","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.796819Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56346","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.823007Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56364","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.853456Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56392","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.880477Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56422","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.900387Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56438","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.915910Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56454","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.946343Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56462","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.964522Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56474","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.982593Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56504","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:06.002322Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56514","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:06.119430Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56534","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:22.666987Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:37604","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:22.683274Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:37634","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.046928Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45506","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.089224Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45516","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.139488Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45530","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.150437Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45550","server-name":"","error":"EOF"}
	
	
	==> gcp-auth [4016d25b4ea6bebd5e4f1f120bb8151fc38d4aba4842de30c291845fa041b242] <==
	2025/12/18 00:14:23 GCP Auth Webhook started!
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	2025/12/18 00:14:53 Ready to marshal response ...
	2025/12/18 00:14:53 Ready to write response ...
	2025/12/18 00:15:02 Ready to marshal response ...
	2025/12/18 00:15:02 Ready to write response ...
	2025/12/18 00:15:10 Ready to marshal response ...
	2025/12/18 00:15:10 Ready to write response ...
	2025/12/18 00:15:27 Ready to marshal response ...
	2025/12/18 00:15:27 Ready to write response ...
	2025/12/18 00:15:35 Ready to marshal response ...
	2025/12/18 00:15:35 Ready to write response ...
	2025/12/18 00:15:35 Ready to marshal response ...
	2025/12/18 00:15:35 Ready to write response ...
	2025/12/18 00:15:43 Ready to marshal response ...
	2025/12/18 00:15:43 Ready to write response ...
	2025/12/18 00:17:29 Ready to marshal response ...
	2025/12/18 00:17:29 Ready to write response ...
	
	
	==> kernel <==
	 00:17:32 up  6:59,  0 user,  load average: 1.45, 1.60, 1.81
	Linux addons-399099 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4] <==
	I1218 00:15:28.756393       1 main.go:301] handling current node
	I1218 00:15:38.753662       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:15:38.753779       1 main.go:301] handling current node
	I1218 00:15:48.756091       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:15:48.756241       1 main.go:301] handling current node
	I1218 00:15:58.754020       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:15:58.754128       1 main.go:301] handling current node
	I1218 00:16:08.756328       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:08.756377       1 main.go:301] handling current node
	I1218 00:16:18.753045       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:18.753313       1 main.go:301] handling current node
	I1218 00:16:28.760382       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:28.760416       1 main.go:301] handling current node
	I1218 00:16:38.760295       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:38.760329       1 main.go:301] handling current node
	I1218 00:16:48.759137       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:48.759173       1 main.go:301] handling current node
	I1218 00:16:58.761404       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:16:58.761439       1 main.go:301] handling current node
	I1218 00:17:08.760289       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:17:08.760321       1 main.go:301] handling current node
	I1218 00:17:18.754726       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:17:18.754847       1 main.go:301] handling current node
	I1218 00:17:28.756642       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:17:28.756744       1 main.go:301] handling current node
	
	
	==> kube-apiserver [c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af] <==
	E1218 00:13:28.895876       1 dispatcher.go:214] "Unhandled Error" err="failed calling webhook \"gcp-auth-mutate.k8s.io\": failed to call webhook: Post \"https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s\": dial tcp 10.109.186.134:443: connect: connection refused" logger="UnhandledError"
	W1218 00:13:29.009990       1 dispatcher.go:210] Failed calling webhook, failing open gcp-auth-mutate.k8s.io: failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.109.186.134:443: connect: connection refused
	E1218 00:13:29.010087       1 dispatcher.go:214] "Unhandled Error" err="failed calling webhook \"gcp-auth-mutate.k8s.io\": failed to call webhook: Post \"https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s\": dial tcp 10.109.186.134:443: connect: connection refused" logger="UnhandledError"
	W1218 00:13:44.040480       1 logging.go:55] [core] [Channel #270 SubChannel #271]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:44.080400       1 logging.go:55] [core] [Channel #274 SubChannel #275]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:44.113340       1 logging.go:55] [core] [Channel #278 SubChannel #279]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W1218 00:13:44.143187       1 logging.go:55] [core] [Channel #282 SubChannel #283]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:45.009582       1 handler_proxy.go:99] no RequestInfo found in the context
	E1218 00:13:45.009685       1 controller.go:146] "Unhandled Error" err=<
		Error updating APIService "v1beta1.metrics.k8s.io" with err: failed to download v1beta1.metrics.k8s.io: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
		, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	 > logger="UnhandledError"
	E1218 00:13:45.010463       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.013895       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.044213       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.078428       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.119428       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	I1218 00:13:45.352108       1 handler.go:285] Adding GroupVersion metrics.k8s.io v1beta1 to ResourceManager
	E1218 00:14:40.215480       1 conn.go:339] Error on socket receive: read tcp 192.168.49.2:8443->192.168.49.1:51234: use of closed network connection
	E1218 00:14:40.432396       1 conn.go:339] Error on socket receive: read tcp 192.168.49.2:8443->192.168.49.1:51266: use of closed network connection
	I1218 00:15:09.979409       1 controller.go:667] quota admission added evaluator for: ingresses.networking.k8s.io
	I1218 00:15:10.272866       1 alloc.go:328] "allocated clusterIPs" service="default/nginx" clusterIPs={"IPv4":"10.106.6.36"}
	I1218 00:15:10.751463       1 controller.go:667] quota admission added evaluator for: volumesnapshots.snapshot.storage.k8s.io
	E1218 00:15:33.576060       1 watch.go:272] "Unhandled Error" err="http2: stream closed" logger="UnhandledError"
	I1218 00:17:30.077304       1 alloc.go:328] "allocated clusterIPs" service="default/hello-world-app" clusterIPs={"IPv4":"10.102.24.196"}
	
	
	==> kube-controller-manager [6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4] <==
	I1218 00:13:14.051193       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1218 00:13:14.058655       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1218 00:13:14.062518       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrapproving"
	I1218 00:13:14.062618       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 00:13:14.062653       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1218 00:13:14.062682       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1218 00:13:14.064556       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1218 00:13:14.064684       1 shared_informer.go:356] "Caches are synced" controller="expand"
	I1218 00:13:14.064927       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1218 00:13:14.065074       1 shared_informer.go:356] "Caches are synced" controller="deployment"
	I1218 00:13:14.065309       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1218 00:13:14.065960       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1218 00:13:14.067087       1 shared_informer.go:356] "Caches are synced" controller="PV protection"
	I1218 00:13:14.067479       1 shared_informer.go:356] "Caches are synced" controller="ReplicationController"
	I1218 00:13:14.067508       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 00:13:14.068573       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 00:13:14.069797       1 shared_informer.go:356] "Caches are synced" controller="validatingadmissionpolicy-status"
	I1218 00:13:29.283416       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	E1218 00:13:44.033325       1 resource_quota_controller.go:446] "Unhandled Error" err="unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1: stale GroupVersion discovery: metrics.k8s.io/v1beta1" logger="UnhandledError"
	I1218 00:13:44.033465       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="volumesnapshots.snapshot.storage.k8s.io"
	I1218 00:13:44.033522       1 shared_informer.go:349] "Waiting for caches to sync" controller="resource quota"
	I1218 00:13:44.068337       1 garbagecollector.go:787] "failed to discover some groups" logger="garbage-collector-controller" groups="map[\"metrics.k8s.io/v1beta1\":\"stale GroupVersion discovery: metrics.k8s.io/v1beta1\"]"
	I1218 00:13:44.077137       1 shared_informer.go:349] "Waiting for caches to sync" controller="garbage collector"
	I1218 00:13:44.136336       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 00:13:44.177307       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	
	
	==> kube-proxy [4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98] <==
	I1218 00:13:15.996913       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:13:16.066401       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 00:13:16.169334       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 00:13:16.169367       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1218 00:13:16.169434       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 00:13:16.221108       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 00:13:16.221158       1 server_linux.go:132] "Using iptables Proxier"
	I1218 00:13:16.232881       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 00:13:16.233195       1 server.go:527] "Version info" version="v1.34.3"
	I1218 00:13:16.233209       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:13:16.254630       1 config.go:200] "Starting service config controller"
	I1218 00:13:16.254652       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 00:13:16.254670       1 config.go:106] "Starting endpoint slice config controller"
	I1218 00:13:16.254676       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 00:13:16.254687       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 00:13:16.254691       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 00:13:16.255417       1 config.go:309] "Starting node config controller"
	I1218 00:13:16.255425       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 00:13:16.255431       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 00:13:16.355642       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 00:13:16.355678       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 00:13:16.355732       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6] <==
	I1218 00:13:07.407170       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	I1218 00:13:07.407421       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:13:07.407481       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:13:07.413728       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E1218 00:13:07.414588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:13:07.424370       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:13:07.425141       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:13:07.425337       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:13:07.425393       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:13:07.425467       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:13:07.425504       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:13:07.425536       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:13:07.425570       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:13:07.425602       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:13:07.425635       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:13:07.425685       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:13:07.426434       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:13:07.426518       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:13:07.426590       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:13:07.426695       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:13:07.426889       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:13:07.427008       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:13:07.427070       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:13:08.401665       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1218 00:13:11.514695       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Dec 18 00:16:28 addons-399099 kubelet[1278]: I1218 00:16:28.821799    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:28 addons-399099 kubelet[1278]: I1218 00:16:28.821848    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:16:28 addons-399099 kubelet[1278]: E1218 00:16:28.822046    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-creds\" with CrashLoopBackOff: \"back-off 40s restarting failed container=registry-creds pod=registry-creds-764b6fb674-txh6b_kube-system(df78a81d-bc8f-4646-b2bc-3b30c7fe0f44)\"" pod="kube-system/registry-creds-764b6fb674-txh6b" podUID="df78a81d-bc8f-4646-b2bc-3b30c7fe0f44"
	Dec 18 00:16:30 addons-399099 kubelet[1278]: I1218 00:16:30.638150    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-proxy-p5q9s" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:30 addons-399099 kubelet[1278]: I1218 00:16:30.638854    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/nvidia-device-plugin-daemonset-d4dsb" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:31 addons-399099 kubelet[1278]: I1218 00:16:31.638553    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-6b586f9694-k4nhf" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:40 addons-399099 kubelet[1278]: I1218 00:16:40.637788    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:40 addons-399099 kubelet[1278]: I1218 00:16:40.637864    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:16:40 addons-399099 kubelet[1278]: E1218 00:16:40.638019    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-creds\" with CrashLoopBackOff: \"back-off 40s restarting failed container=registry-creds pod=registry-creds-764b6fb674-txh6b_kube-system(df78a81d-bc8f-4646-b2bc-3b30c7fe0f44)\"" pod="kube-system/registry-creds-764b6fb674-txh6b" podUID="df78a81d-bc8f-4646-b2bc-3b30c7fe0f44"
	Dec 18 00:16:55 addons-399099 kubelet[1278]: I1218 00:16:55.639185    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:16:55 addons-399099 kubelet[1278]: I1218 00:16:55.639262    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:16:55 addons-399099 kubelet[1278]: E1218 00:16:55.640009    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-creds\" with CrashLoopBackOff: \"back-off 40s restarting failed container=registry-creds pod=registry-creds-764b6fb674-txh6b_kube-system(df78a81d-bc8f-4646-b2bc-3b30c7fe0f44)\"" pod="kube-system/registry-creds-764b6fb674-txh6b" podUID="df78a81d-bc8f-4646-b2bc-3b30c7fe0f44"
	Dec 18 00:17:06 addons-399099 kubelet[1278]: I1218 00:17:06.637800    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:17:06 addons-399099 kubelet[1278]: I1218 00:17:06.638380    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:17:06 addons-399099 kubelet[1278]: E1218 00:17:06.638719    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-creds\" with CrashLoopBackOff: \"back-off 40s restarting failed container=registry-creds pod=registry-creds-764b6fb674-txh6b_kube-system(df78a81d-bc8f-4646-b2bc-3b30c7fe0f44)\"" pod="kube-system/registry-creds-764b6fb674-txh6b" podUID="df78a81d-bc8f-4646-b2bc-3b30c7fe0f44"
	Dec 18 00:17:21 addons-399099 kubelet[1278]: I1218 00:17:21.638639    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:17:21 addons-399099 kubelet[1278]: I1218 00:17:21.638703    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:17:22 addons-399099 kubelet[1278]: I1218 00:17:22.010194    1278 scope.go:117] "RemoveContainer" containerID="ff391dbab08b313d09c23948d7b6613aa7dfdbc3ef7c17cdb960b86d695de6f6"
	Dec 18 00:17:22 addons-399099 kubelet[1278]: I1218 00:17:22.010563    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-creds-764b6fb674-txh6b" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:17:22 addons-399099 kubelet[1278]: I1218 00:17:22.010615    1278 scope.go:117] "RemoveContainer" containerID="b5deac7a2ade515f1d527abc9de8007729421bf2904758a078a9306e12a7f570"
	Dec 18 00:17:22 addons-399099 kubelet[1278]: E1218 00:17:22.010817    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-creds\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry-creds pod=registry-creds-764b6fb674-txh6b_kube-system(df78a81d-bc8f-4646-b2bc-3b30c7fe0f44)\"" pod="kube-system/registry-creds-764b6fb674-txh6b" podUID="df78a81d-bc8f-4646-b2bc-3b30c7fe0f44"
	Dec 18 00:17:30 addons-399099 kubelet[1278]: I1218 00:17:30.072151    1278 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htrh\" (UniqueName: \"kubernetes.io/projected/beb0d2c7-9bbd-473d-abaa-8eed401ad976-kube-api-access-6htrh\") pod \"hello-world-app-5d498dc89-4qqlr\" (UID: \"beb0d2c7-9bbd-473d-abaa-8eed401ad976\") " pod="default/hello-world-app-5d498dc89-4qqlr"
	Dec 18 00:17:30 addons-399099 kubelet[1278]: I1218 00:17:30.072772    1278 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"gcp-creds\" (UniqueName: \"kubernetes.io/host-path/beb0d2c7-9bbd-473d-abaa-8eed401ad976-gcp-creds\") pod \"hello-world-app-5d498dc89-4qqlr\" (UID: \"beb0d2c7-9bbd-473d-abaa-8eed401ad976\") " pod="default/hello-world-app-5d498dc89-4qqlr"
	Dec 18 00:17:30 addons-399099 kubelet[1278]: W1218 00:17:30.274314    1278 manager.go:1169] Failed to process watch event {EventType:0 Name:/docker/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/crio-27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f WatchSource:0}: Error finding container 27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f: Status 404 returned error can't find the container with id 27e59bd4428003dbc2c92e81430519af3e836a2e42629ca40e7052d89fd5ab7f
	Dec 18 00:17:31 addons-399099 kubelet[1278]: I1218 00:17:31.062521    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/hello-world-app-5d498dc89-4qqlr" podStartSLOduration=1.448378402 podStartE2EDuration="2.062502044s" podCreationTimestamp="2025-12-18 00:17:29 +0000 UTC" firstStartedPulling="2025-12-18 00:17:30.279029871 +0000 UTC m=+260.756643300" lastFinishedPulling="2025-12-18 00:17:30.893153513 +0000 UTC m=+261.370766942" observedRunningTime="2025-12-18 00:17:31.060808652 +0000 UTC m=+261.538422081" watchObservedRunningTime="2025-12-18 00:17:31.062502044 +0000 UTC m=+261.540115481"
	
	
	==> storage-provisioner [63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67] <==
	W1218 00:17:06.871149       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:08.874030       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:08.880801       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:10.885212       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:10.890007       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:12.893501       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:12.899940       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:14.908362       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:14.913042       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:16.915708       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:16.920313       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:18.923005       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:18.929384       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:20.932879       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:20.937058       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:22.939886       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:22.944451       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:24.947508       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:24.953092       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:26.957666       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:26.963850       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:28.966900       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:28.973362       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:30.978297       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:17:30.982890       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p addons-399099 -n addons-399099
helpers_test.go:270: (dbg) Run:  kubectl --context addons-399099 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:281: non-running pods: ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w
helpers_test.go:283: ======> post-mortem[TestAddons/parallel/Ingress]: describe non-running pods <======
helpers_test.go:286: (dbg) Run:  kubectl --context addons-399099 describe pod ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w
helpers_test.go:286: (dbg) Non-zero exit: kubectl --context addons-399099 describe pod ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w: exit status 1 (83.317417ms)

** stderr ** 
	Error from server (NotFound): pods "ingress-nginx-admission-create-8kc27" not found
	Error from server (NotFound): pods "ingress-nginx-admission-patch-b9f7w" not found

** /stderr **
helpers_test.go:288: kubectl --context addons-399099 describe pod ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w: exit status 1
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable ingress-dns --alsologtostderr -v=1: exit status 11 (286.108921ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:17:33.076333 1170133 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:17:33.077073 1170133 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:17:33.077090 1170133 out.go:374] Setting ErrFile to fd 2...
	I1218 00:17:33.077098 1170133 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:17:33.077472 1170133 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:17:33.077827 1170133 mustload.go:66] Loading cluster: addons-399099
	I1218 00:17:33.078464 1170133 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:17:33.078485 1170133 addons.go:622] checking whether the cluster is paused
	I1218 00:17:33.078617 1170133 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:17:33.078632 1170133 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:17:33.079554 1170133 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:17:33.096903 1170133 ssh_runner.go:195] Run: systemctl --version
	I1218 00:17:33.096973 1170133 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:17:33.116334 1170133 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:17:33.223464 1170133 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:17:33.223712 1170133 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:17:33.268569 1170133 cri.go:89] found id: "b5deac7a2ade515f1d527abc9de8007729421bf2904758a078a9306e12a7f570"
	I1218 00:17:33.268642 1170133 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:17:33.268674 1170133 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:17:33.268691 1170133 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:17:33.268724 1170133 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:17:33.268748 1170133 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:17:33.268766 1170133 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:17:33.268784 1170133 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:17:33.268812 1170133 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:17:33.268836 1170133 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:17:33.268855 1170133 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:17:33.268886 1170133 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:17:33.268908 1170133 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:17:33.268928 1170133 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:17:33.268947 1170133 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:17:33.268996 1170133 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:17:33.269034 1170133 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:17:33.269071 1170133 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:17:33.269090 1170133 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:17:33.269120 1170133 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:17:33.269144 1170133 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:17:33.269161 1170133 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:17:33.269179 1170133 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:17:33.269210 1170133 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:17:33.269231 1170133 cri.go:89] found id: ""
	I1218 00:17:33.269314 1170133 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:17:33.296628 1170133 out.go:203] 
	W1218 00:17:33.299563 1170133 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:17:33Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:17:33Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:17:33.299636 1170133 out.go:285] * 
	* 
	W1218 00:17:33.307468 1170133 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_4116e8848b7c0e6a40fa9061a5ca6da2e0eb6ead_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_4116e8848b7c0e6a40fa9061a5ca6da2e0eb6ead_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:17:33.310640 1170133 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable ingress-dns addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable ingress-dns --alsologtostderr -v=1": exit status 11
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable ingress --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable ingress --alsologtostderr -v=1: exit status 11 (284.173781ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:17:33.368001 1170245 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:17:33.368659 1170245 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:17:33.368675 1170245 out.go:374] Setting ErrFile to fd 2...
	I1218 00:17:33.368682 1170245 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:17:33.368965 1170245 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:17:33.369272 1170245 mustload.go:66] Loading cluster: addons-399099
	I1218 00:17:33.369691 1170245 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:17:33.369712 1170245 addons.go:622] checking whether the cluster is paused
	I1218 00:17:33.369834 1170245 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:17:33.369853 1170245 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:17:33.370478 1170245 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:17:33.405228 1170245 ssh_runner.go:195] Run: systemctl --version
	I1218 00:17:33.405295 1170245 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:17:33.426639 1170245 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:17:33.534785 1170245 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:17:33.534883 1170245 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:17:33.566973 1170245 cri.go:89] found id: "b5deac7a2ade515f1d527abc9de8007729421bf2904758a078a9306e12a7f570"
	I1218 00:17:33.566995 1170245 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:17:33.567000 1170245 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:17:33.567004 1170245 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:17:33.567007 1170245 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:17:33.567011 1170245 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:17:33.567014 1170245 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:17:33.567017 1170245 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:17:33.567021 1170245 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:17:33.567028 1170245 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:17:33.567031 1170245 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:17:33.567035 1170245 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:17:33.567039 1170245 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:17:33.567042 1170245 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:17:33.567046 1170245 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:17:33.567051 1170245 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:17:33.567062 1170245 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:17:33.567068 1170245 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:17:33.567071 1170245 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:17:33.567074 1170245 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:17:33.567079 1170245 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:17:33.567083 1170245 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:17:33.567086 1170245 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:17:33.567089 1170245 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:17:33.567092 1170245 cri.go:89] found id: ""
	I1218 00:17:33.567142 1170245 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:17:33.581400 1170245 out.go:203] 
	W1218 00:17:33.584346 1170245 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:17:33Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:17:33Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:17:33.584371 1170245 out.go:285] * 
	* 
	W1218 00:17:33.592114 1170245 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_62553deefc570c97f2052ef703df7b8905a654d6_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_62553deefc570c97f2052ef703df7b8905a654d6_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:17:33.594990 1170245 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable ingress addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable ingress --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/Ingress (143.96s)

TestAddons/parallel/InspektorGadget (6.29s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:353: "gadget-rc6dl" [5673c8fe-5371-4818-807f-300da1b10f4f] Running
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 6.003338162s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable inspektor-gadget --alsologtostderr -v=1: exit status 11 (286.437899ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:09.405991 1167779 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:09.406821 1167779 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:09.406836 1167779 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:09.406842 1167779 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:09.407093 1167779 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:09.407370 1167779 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:09.407757 1167779 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:09.407773 1167779 addons.go:622] checking whether the cluster is paused
	I1218 00:15:09.407883 1167779 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:09.407901 1167779 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:09.408465 1167779 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:09.426020 1167779 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:09.426082 1167779 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:09.444598 1167779 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:09.562770 1167779 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:09.562906 1167779 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:09.596613 1167779 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:09.596632 1167779 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:09.596637 1167779 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:09.596641 1167779 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:09.596645 1167779 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:09.596648 1167779 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:09.596651 1167779 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:09.596654 1167779 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:09.596657 1167779 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:09.596663 1167779 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:09.596667 1167779 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:09.596670 1167779 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:09.596673 1167779 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:09.596676 1167779 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:09.596679 1167779 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:09.596685 1167779 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:09.596688 1167779 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:09.596693 1167779 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:09.596696 1167779 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:09.596699 1167779 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:09.596710 1167779 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:09.596713 1167779 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:09.596716 1167779 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:09.596724 1167779 cri.go:89] found id: ""
	I1218 00:15:09.596774 1167779 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:09.616629 1167779 out.go:203] 
	W1218 00:15:09.621900 1167779 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:09Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:09Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:09.621929 1167779 out.go:285] * 
	* 
	W1218 00:15:09.629869 1167779 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_07218961934993dd21acc63caaf1aa08873c018e_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_07218961934993dd21acc63caaf1aa08873c018e_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:09.633714 1167779 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable inspektor-gadget addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable inspektor-gadget --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/InspektorGadget (6.29s)

TestAddons/parallel/MetricsServer (5.41s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:457: metrics-server stabilized in 5.189016ms
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:353: "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Running
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.003820418s
addons_test.go:465: (dbg) Run:  kubectl --context addons-399099 top pods -n kube-system
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable metrics-server --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable metrics-server --alsologtostderr -v=1: exit status 11 (281.094415ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:03.135237 1167567 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:03.136132 1167567 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:03.136176 1167567 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:03.136197 1167567 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:03.136517 1167567 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:03.136914 1167567 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:03.137342 1167567 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:03.137384 1167567 addons.go:622] checking whether the cluster is paused
	I1218 00:15:03.137514 1167567 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:03.137548 1167567 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:03.138100 1167567 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:03.157288 1167567 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:03.157351 1167567 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:03.176383 1167567 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:03.282831 1167567 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:03.282952 1167567 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:03.313793 1167567 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:03.313815 1167567 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:03.313820 1167567 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:03.313836 1167567 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:03.313840 1167567 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:03.313844 1167567 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:03.313847 1167567 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:03.313850 1167567 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:03.313853 1167567 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:03.313859 1167567 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:03.313868 1167567 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:03.313871 1167567 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:03.313874 1167567 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:03.313877 1167567 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:03.313881 1167567 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:03.313886 1167567 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:03.313892 1167567 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:03.313896 1167567 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:03.313899 1167567 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:03.313902 1167567 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:03.313906 1167567 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:03.313909 1167567 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:03.313912 1167567 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:03.313915 1167567 cri.go:89] found id: ""
	I1218 00:15:03.313968 1167567 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:03.329019 1167567 out.go:203] 
	W1218 00:15:03.332028 1167567 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:03Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:03Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:03.332058 1167567 out.go:285] * 
	* 
	W1218 00:15:03.340495 1167567 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_9e377edc2b59264359e9c26f81b048e390fa608a_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_9e377edc2b59264359e9c26f81b048e390fa608a_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:03.343550 1167567 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable metrics-server addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable metrics-server --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/MetricsServer (5.41s)

TestAddons/parallel/CSI (50.25s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI
=== CONT  TestAddons/parallel/CSI
I1218 00:14:44.249940 1159552 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1218 00:14:44.252823 1159552 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1218 00:14:44.252854 1159552 kapi.go:107] duration metric: took 6.663ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:551: csi-hostpath-driver pods stabilized in 6.674372ms
addons_test.go:554: (dbg) Run:  kubectl --context addons-399099 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:559: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:564: (dbg) Run:  kubectl --context addons-399099 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:569: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:353: "task-pv-pod" [0d848891-1657-494f-8b40-d3d8d839c18b] Pending
helpers_test.go:353: "task-pv-pod" [0d848891-1657-494f-8b40-d3d8d839c18b] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:353: "task-pv-pod" [0d848891-1657-494f-8b40-d3d8d839c18b] Running
addons_test.go:569: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 8.004026751s
addons_test.go:574: (dbg) Run:  kubectl --context addons-399099 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:579: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:428: (dbg) Run:  kubectl --context addons-399099 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:428: (dbg) Run:  kubectl --context addons-399099 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:584: (dbg) Run:  kubectl --context addons-399099 delete pod task-pv-pod
addons_test.go:590: (dbg) Run:  kubectl --context addons-399099 delete pvc hpvc
addons_test.go:596: (dbg) Run:  kubectl --context addons-399099 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:601: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:606: (dbg) Run:  kubectl --context addons-399099 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:611: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:353: "task-pv-pod-restore" [1f9ac880-cd32-4e0d-adfa-bf69980c44e2] Pending
helpers_test.go:353: "task-pv-pod-restore" [1f9ac880-cd32-4e0d-adfa-bf69980c44e2] Running
addons_test.go:611: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 6.003889682s
addons_test.go:616: (dbg) Run:  kubectl --context addons-399099 delete pod task-pv-pod-restore
addons_test.go:620: (dbg) Run:  kubectl --context addons-399099 delete pvc hpvc-restore
addons_test.go:624: (dbg) Run:  kubectl --context addons-399099 delete volumesnapshot new-snapshot-demo
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable volumesnapshots --alsologtostderr -v=1: exit status 11 (270.505038ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:34.030423 1168436 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:34.031148 1168436 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.031231 1168436 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:34.031239 1168436 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.031553 1168436 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:34.031946 1168436 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:34.032438 1168436 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.032459 1168436 addons.go:622] checking whether the cluster is paused
	I1218 00:15:34.032615 1168436 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.032634 1168436 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:34.033200 1168436 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:34.052835 1168436 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:34.052890 1168436 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:34.071046 1168436 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:34.179181 1168436 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:34.179274 1168436 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:34.207151 1168436 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:34.207174 1168436 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:34.207180 1168436 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:34.207183 1168436 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:34.207187 1168436 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:34.207191 1168436 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:34.207195 1168436 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:34.207198 1168436 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:34.207202 1168436 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:34.207208 1168436 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:34.207212 1168436 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:34.207215 1168436 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:34.207218 1168436 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:34.207221 1168436 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:34.207227 1168436 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:34.207232 1168436 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:34.207235 1168436 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:34.207239 1168436 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:34.207242 1168436 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:34.207245 1168436 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:34.207249 1168436 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:34.207252 1168436 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:34.207255 1168436 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:34.207259 1168436 cri.go:89] found id: ""
	I1218 00:15:34.207310 1168436 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:34.222209 1168436 out.go:203] 
	W1218 00:15:34.225475 1168436 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:34.225506 1168436 out.go:285] * 
	* 
	W1218 00:15:34.233262 1168436 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_f6150db7515caf82d8c4c5baeba9fd21f738a7e0_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_f6150db7515caf82d8c4c5baeba9fd21f738a7e0_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:34.236304 1168436 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable volumesnapshots addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable volumesnapshots --alsologtostderr -v=1": exit status 11
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable csi-hostpath-driver --alsologtostderr -v=1: exit status 11 (255.196201ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:34.291345 1168480 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:34.292068 1168480 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.292081 1168480 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:34.292087 1168480 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:34.292372 1168480 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:34.292654 1168480 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:34.293060 1168480 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.293078 1168480 addons.go:622] checking whether the cluster is paused
	I1218 00:15:34.293182 1168480 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:34.293195 1168480 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:34.293691 1168480 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:34.310756 1168480 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:34.310828 1168480 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:34.328688 1168480 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:34.434514 1168480 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:34.434602 1168480 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:34.462850 1168480 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:34.462873 1168480 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:34.462878 1168480 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:34.462882 1168480 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:34.462885 1168480 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:34.462889 1168480 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:34.462892 1168480 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:34.462895 1168480 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:34.462897 1168480 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:34.462907 1168480 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:34.462910 1168480 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:34.462914 1168480 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:34.462917 1168480 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:34.462920 1168480 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:34.462923 1168480 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:34.462932 1168480 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:34.462938 1168480 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:34.462943 1168480 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:34.462946 1168480 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:34.462949 1168480 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:34.462954 1168480 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:34.462960 1168480 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:34.462963 1168480 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:34.462966 1168480 cri.go:89] found id: ""
	I1218 00:15:34.463018 1168480 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:34.477082 1168480 out.go:203] 
	W1218 00:15:34.480064 1168480 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:34Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:34.480085 1168480 out.go:285] * 
	* 
	W1218 00:15:34.487746 1168480 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_913eef9b964ccef8b5b536327192b81f4aff5da9_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_913eef9b964ccef8b5b536327192b81f4aff5da9_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:34.490692 1168480 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable csi-hostpath-driver addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable csi-hostpath-driver --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/CSI (50.25s)

TestAddons/parallel/Headlamp (3.4s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:810: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-399099 --alsologtostderr -v=1
addons_test.go:810: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable headlamp -p addons-399099 --alsologtostderr -v=1: exit status 11 (275.770109ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:14:40.907461 1166705 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:14:40.908292 1166705 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:40.908333 1166705 out.go:374] Setting ErrFile to fd 2...
	I1218 00:14:40.908356 1166705 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:40.908621 1166705 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:14:40.909162 1166705 mustload.go:66] Loading cluster: addons-399099
	I1218 00:14:40.910223 1166705 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:40.910281 1166705 addons.go:622] checking whether the cluster is paused
	I1218 00:14:40.910451 1166705 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:40.910490 1166705 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:14:40.911227 1166705 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:14:40.931534 1166705 ssh_runner.go:195] Run: systemctl --version
	I1218 00:14:40.931593 1166705 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:14:40.949981 1166705 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:14:41.054837 1166705 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:14:41.054974 1166705 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:14:41.086266 1166705 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:14:41.086288 1166705 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:14:41.086293 1166705 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:14:41.086297 1166705 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:14:41.086301 1166705 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:14:41.086305 1166705 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:14:41.086308 1166705 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:14:41.086311 1166705 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:14:41.086334 1166705 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:14:41.086348 1166705 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:14:41.086355 1166705 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:14:41.086359 1166705 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:14:41.086362 1166705 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:14:41.086365 1166705 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:14:41.086368 1166705 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:14:41.086374 1166705 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:14:41.086380 1166705 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:14:41.086385 1166705 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:14:41.086387 1166705 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:14:41.086391 1166705 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:14:41.086396 1166705 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:14:41.086413 1166705 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:14:41.086416 1166705 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:14:41.086420 1166705 cri.go:89] found id: ""
	I1218 00:14:41.086486 1166705 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:14:41.102397 1166705 out.go:203] 
	W1218 00:14:41.105267 1166705 out.go:285] X Exiting due to MK_ADDON_ENABLE_PAUSED: enabled failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:41Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_ENABLE_PAUSED: enabled failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:41Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:14:41.105314 1166705 out.go:285] * 
	* 
	W1218 00:14:41.113130 1166705 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_af3b8a9ce4f102efc219f1404c9eed7a69cbf2d5_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_af3b8a9ce4f102efc219f1404c9eed7a69cbf2d5_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:14:41.116385 1166705 out.go:203] 

** /stderr **
addons_test.go:812: failed to enable headlamp addon: args: "out/minikube-linux-arm64 addons enable headlamp -p addons-399099 --alsologtostderr -v=1": exit status 11
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestAddons/parallel/Headlamp]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestAddons/parallel/Headlamp]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect addons-399099
helpers_test.go:253: (dbg) docker inspect addons-399099:

-- stdout --
	[
	    {
	        "Id": "deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3",
	        "Created": "2025-12-18T00:12:45.653994198Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1160961,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:12:45.715729745Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/hostname",
	        "HostsPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/hosts",
	        "LogPath": "/var/lib/docker/containers/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3/deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3-json.log",
	        "Name": "/addons-399099",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "addons-399099:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "addons-399099",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "deedfeeb0da088accf076a89d6e19c7d8e2b278702fadb83a04aa592395c30a3",
	                "LowerDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/merged",
	                "UpperDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/diff",
	                "WorkDir": "/var/lib/docker/overlay2/b8f57b972f5ff9a5d213f11b343a1fa7e7ccec8349fc15a48bfb718dc8cecb4c/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "addons-399099",
	                "Source": "/var/lib/docker/volumes/addons-399099/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "addons-399099",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "addons-399099",
	                "name.minikube.sigs.k8s.io": "addons-399099",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "96bfedaa930e5289f84629700928b9067f16cc86feaf5d8687fda240201d7ae8",
	            "SandboxKey": "/var/run/docker/netns/96bfedaa930e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33910"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33911"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33914"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33912"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33913"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "addons-399099": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "ce:64:2e:10:5f:9e",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "b6eb769954854278d027b4718b55a4f007c530da20340571549bfde161f35973",
	                    "EndpointID": "c19b4d95c5ba48737dc0c8876e34e71428f125165d4d38827642582fc8c51bfd",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "addons-399099",
	                        "deedfeeb0da0"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p addons-399099 -n addons-399099
helpers_test.go:253: <<< TestAddons/parallel/Headlamp FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestAddons/parallel/Headlamp]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p addons-399099 logs -n 25: (1.69950533s)
helpers_test.go:261: TestAddons/parallel/Headlamp logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                                                                                                                                                                   ARGS                                                                                                                                                                                                                                   │        PROFILE         │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                                │ download-only-293945   │ jenkins │ v1.37.0 │ 18 Dec 25 00:11 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                                                                                                                                                                                                                                                                                                    │ minikube               │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-293945                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-293945   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ -o=json --download-only -p download-only-782051 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=crio --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                                │ download-only-782051   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                                                                                                                                                                                                                                                                                                    │ minikube               │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-782051                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-782051   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ -o=json --download-only -p download-only-398668 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=crio --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                           │ download-only-398668   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                                                                                                                                                                                                                                                                                                    │ minikube               │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-398668                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-398668   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-293945                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-293945   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-782051                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-782051   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-398668                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ download-only-398668   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ --download-only -p download-docker-540812 --alsologtostderr --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                                                                                                    │ download-docker-540812 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ -p download-docker-540812                                                                                                                                                                                                                                                                                                                                                                                                                                                │ download-docker-540812 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ --download-only -p binary-mirror-966047 --alsologtostderr --binary-mirror http://127.0.0.1:45057 --driver=docker  --container-runtime=crio                                                                                                                                                                                                                                                                                                                               │ binary-mirror-966047   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ -p binary-mirror-966047                                                                                                                                                                                                                                                                                                                                                                                                                                                  │ binary-mirror-966047   │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ addons  │ enable dashboard -p addons-399099                                                                                                                                                                                                                                                                                                                                                                                                                                        │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ addons  │ disable dashboard -p addons-399099                                                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ start   │ -p addons-399099 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=crio --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:14 UTC │
	│ addons  │ addons-399099 addons disable volcano --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                              │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ addons-399099 addons disable gcp-auth --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	│ addons  │ enable headlamp -p addons-399099 --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                  │ addons-399099          │ jenkins │ v1.37.0 │ 18 Dec 25 00:14 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:12:20
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:12:20.984924 1160558 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:12:20.985091 1160558 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:20.985121 1160558 out.go:374] Setting ErrFile to fd 2...
	I1218 00:12:20.985143 1160558 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:20.985408 1160558 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:12:20.985906 1160558 out.go:368] Setting JSON to false
	I1218 00:12:20.986702 1160558 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":24889,"bootTime":1765991852,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:12:20.986800 1160558 start.go:143] virtualization:  
	I1218 00:12:20.990266 1160558 out.go:179] * [addons-399099] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:12:20.994161 1160558 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:12:20.994217 1160558 notify.go:221] Checking for updates...
	I1218 00:12:20.999981 1160558 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:12:21.011998 1160558 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:12:21.014920 1160558 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:12:21.017882 1160558 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:12:21.020774 1160558 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:12:21.023900 1160558 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:12:21.048358 1160558 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:12:21.048496 1160558 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:21.114384 1160558 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:21.100124203 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:21.114502 1160558 docker.go:319] overlay module found
	I1218 00:12:21.117714 1160558 out.go:179] * Using the docker driver based on user configuration
	I1218 00:12:21.120516 1160558 start.go:309] selected driver: docker
	I1218 00:12:21.120540 1160558 start.go:927] validating driver "docker" against <nil>
	I1218 00:12:21.120554 1160558 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:12:21.121256 1160558 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:21.173103 1160558 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:21.164325274 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:21.173256 1160558 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:12:21.173491 1160558 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:12:21.176370 1160558 out.go:179] * Using Docker driver with root privileges
	I1218 00:12:21.179089 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:12:21.179158 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:21.179170 1160558 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:12:21.179249 1160558 start.go:353] cluster config:
	{Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:21.182318 1160558 out.go:179] * Starting "addons-399099" primary control-plane node in "addons-399099" cluster
	I1218 00:12:21.185089 1160558 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:12:21.187981 1160558 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:12:21.190799 1160558 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:12:21.190978 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:21.191004 1160558 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:12:21.191012 1160558 cache.go:65] Caching tarball of preloaded images
	I1218 00:12:21.191088 1160558 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:12:21.191104 1160558 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:12:21.191471 1160558 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json ...
	I1218 00:12:21.191499 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json: {Name:mka25bf273bdc24fbc031875fcf06423ccf24563 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:21.206938 1160558 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:12:21.207083 1160558 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1218 00:12:21.207109 1160558 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory, skipping pull
	I1218 00:12:21.207114 1160558 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in cache, skipping pull
	I1218 00:12:21.207122 1160558 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 as a tarball
	I1218 00:12:21.207127 1160558 cache.go:176] Loading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 from local cache
	I1218 00:12:39.058986 1160558 cache.go:178] successfully loaded and using gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 from cached tarball
	I1218 00:12:39.059028 1160558 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:12:39.059067 1160558 start.go:360] acquireMachinesLock for addons-399099: {Name:mkf472e05bf018f075f6ec92cb001b01a2413843 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:12:39.059202 1160558 start.go:364] duration metric: took 111.406µs to acquireMachinesLock for "addons-399099"
	I1218 00:12:39.059238 1160558 start.go:93] Provisioning new machine with config: &{Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:12:39.059318 1160558 start.go:125] createHost starting for "" (driver="docker")
	I1218 00:12:39.062874 1160558 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	I1218 00:12:39.063125 1160558 start.go:159] libmachine.API.Create for "addons-399099" (driver="docker")
	I1218 00:12:39.063166 1160558 client.go:173] LocalClient.Create starting
	I1218 00:12:39.063292 1160558 main.go:143] libmachine: Creating CA: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem
	I1218 00:12:39.196682 1160558 main.go:143] libmachine: Creating client certificate: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem
	I1218 00:12:39.459233 1160558 cli_runner.go:164] Run: docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1218 00:12:39.475616 1160558 cli_runner.go:211] docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1218 00:12:39.475719 1160558 network_create.go:284] running [docker network inspect addons-399099] to gather additional debugging logs...
	I1218 00:12:39.475744 1160558 cli_runner.go:164] Run: docker network inspect addons-399099
	W1218 00:12:39.491777 1160558 cli_runner.go:211] docker network inspect addons-399099 returned with exit code 1
	I1218 00:12:39.491810 1160558 network_create.go:287] error running [docker network inspect addons-399099]: docker network inspect addons-399099: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network addons-399099 not found
	I1218 00:12:39.491824 1160558 network_create.go:289] output of [docker network inspect addons-399099]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network addons-399099 not found
	
	** /stderr **
	I1218 00:12:39.491956 1160558 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:12:39.508539 1160558 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001983900}
	I1218 00:12:39.508584 1160558 network_create.go:124] attempt to create docker network addons-399099 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1218 00:12:39.508639 1160558 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=addons-399099 addons-399099
	I1218 00:12:39.570642 1160558 network_create.go:108] docker network addons-399099 192.168.49.0/24 created
	I1218 00:12:39.570675 1160558 kic.go:121] calculated static IP "192.168.49.2" for the "addons-399099" container
	I1218 00:12:39.570747 1160558 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1218 00:12:39.585760 1160558 cli_runner.go:164] Run: docker volume create addons-399099 --label name.minikube.sigs.k8s.io=addons-399099 --label created_by.minikube.sigs.k8s.io=true
	I1218 00:12:39.603687 1160558 oci.go:103] Successfully created a docker volume addons-399099
	I1218 00:12:39.603780 1160558 cli_runner.go:164] Run: docker run --rm --name addons-399099-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --entrypoint /usr/bin/test -v addons-399099:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1218 00:12:41.586921 1160558 cli_runner.go:217] Completed: docker run --rm --name addons-399099-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --entrypoint /usr/bin/test -v addons-399099:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib: (1.983100665s)
	I1218 00:12:41.586954 1160558 oci.go:107] Successfully prepared a docker volume addons-399099
	I1218 00:12:41.587002 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:41.587017 1160558 kic.go:194] Starting extracting preloaded images to volume ...
	I1218 00:12:41.587085 1160558 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v addons-399099:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1218 00:12:45.575570 1160558 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v addons-399099:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.988446202s)
	I1218 00:12:45.575602 1160558 kic.go:203] duration metric: took 3.988582321s to extract preloaded images to volume ...
	W1218 00:12:45.575745 1160558 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1218 00:12:45.575863 1160558 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1218 00:12:45.637562 1160558 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname addons-399099 --name addons-399099 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-399099 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=addons-399099 --network addons-399099 --ip 192.168.49.2 --volume addons-399099:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1218 00:12:45.928846 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Running}}
	I1218 00:12:45.954902 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:45.981614 1160558 cli_runner.go:164] Run: docker exec addons-399099 stat /var/lib/dpkg/alternatives/iptables
	I1218 00:12:46.048704 1160558 oci.go:144] the created container "addons-399099" has a running status.
	I1218 00:12:46.048736 1160558 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa...
	I1218 00:12:46.363025 1160558 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1218 00:12:46.389900 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:46.423930 1160558 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1218 00:12:46.423949 1160558 kic_runner.go:114] Args: [docker exec --privileged addons-399099 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1218 00:12:46.511600 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:12:46.535966 1160558 machine.go:94] provisionDockerMachine start ...
	I1218 00:12:46.536065 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:46.564128 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:46.564718 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:46.564735 1160558 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:12:46.565455 1160558 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1218 00:12:49.723694 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-399099
	
	I1218 00:12:49.723719 1160558 ubuntu.go:182] provisioning hostname "addons-399099"
	I1218 00:12:49.723784 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:49.741190 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:49.741517 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:49.741533 1160558 main.go:143] libmachine: About to run SSH command:
	sudo hostname addons-399099 && echo "addons-399099" | sudo tee /etc/hostname
	I1218 00:12:49.901181 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-399099
	
	I1218 00:12:49.901265 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:49.919175 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:49.919483 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:49.919498 1160558 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-399099' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-399099/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-399099' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:12:50.072522 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:12:50.072548 1160558 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:12:50.072581 1160558 ubuntu.go:190] setting up certificates
	I1218 00:12:50.072601 1160558 provision.go:84] configureAuth start
	I1218 00:12:50.072675 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:50.090136 1160558 provision.go:143] copyHostCerts
	I1218 00:12:50.090228 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:12:50.090368 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:12:50.090428 1160558 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:12:50.090485 1160558 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.addons-399099 san=[127.0.0.1 192.168.49.2 addons-399099 localhost minikube]
	I1218 00:12:50.250433 1160558 provision.go:177] copyRemoteCerts
	I1218 00:12:50.250499 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:12:50.250538 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.267324 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:50.371734 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1218 00:12:50.388330 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:12:50.405058 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:12:50.421976 1160558 provision.go:87] duration metric: took 349.345812ms to configureAuth
	I1218 00:12:50.422055 1160558 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:12:50.422272 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:12:50.422404 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.439398 1160558 main.go:143] libmachine: Using SSH client type: native
	I1218 00:12:50.439713 1160558 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33910 <nil> <nil>}
	I1218 00:12:50.439726 1160558 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:12:50.751149 1160558 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:12:50.751174 1160558 machine.go:97] duration metric: took 4.215187549s to provisionDockerMachine
	I1218 00:12:50.751186 1160558 client.go:176] duration metric: took 11.688009231s to LocalClient.Create
	I1218 00:12:50.751199 1160558 start.go:167] duration metric: took 11.688075633s to libmachine.API.Create "addons-399099"
	I1218 00:12:50.751206 1160558 start.go:293] postStartSetup for "addons-399099" (driver="docker")
	I1218 00:12:50.751216 1160558 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:12:50.751286 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:12:50.751329 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.777333 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:50.883963 1160558 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:12:50.887177 1160558 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:12:50.887204 1160558 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:12:50.887215 1160558 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:12:50.887279 1160558 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:12:50.887307 1160558 start.go:296] duration metric: took 136.094827ms for postStartSetup
	I1218 00:12:50.887628 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:50.903943 1160558 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/config.json ...
	I1218 00:12:50.904418 1160558 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:12:50.904478 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:50.920969 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.025589 1160558 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:12:51.030352 1160558 start.go:128] duration metric: took 11.971017128s to createHost
	I1218 00:12:51.030377 1160558 start.go:83] releasing machines lock for "addons-399099", held for 11.971162198s
	I1218 00:12:51.030447 1160558 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-399099
	I1218 00:12:51.048173 1160558 ssh_runner.go:195] Run: cat /version.json
	I1218 00:12:51.048277 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:51.048627 1160558 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:12:51.048700 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:12:51.071640 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.075457 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:12:51.179876 1160558 ssh_runner.go:195] Run: systemctl --version
	I1218 00:12:51.270909 1160558 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:12:51.305372 1160558 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:12:51.309672 1160558 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:12:51.309744 1160558 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:12:51.336895 1160558 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1218 00:12:51.336919 1160558 start.go:496] detecting cgroup driver to use...
	I1218 00:12:51.336952 1160558 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:12:51.337001 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:12:51.353834 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:12:51.366489 1160558 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:12:51.366556 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:12:51.384345 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:12:51.403492 1160558 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:12:51.538040 1160558 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:12:51.671453 1160558 docker.go:234] disabling docker service ...
	I1218 00:12:51.671517 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:12:51.694900 1160558 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:12:51.707489 1160558 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:12:51.841376 1160558 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:12:51.963812 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:12:51.976097 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:12:51.989106 1160558 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:12:51.989215 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:51.997295 1160558 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:12:51.997412 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.007851 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.017826 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.026949 1160558 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:12:52.036059 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.044833 1160558 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.059581 1160558 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:12:52.068949 1160558 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:12:52.077280 1160558 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:12:52.084832 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:12:52.205686 1160558 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:12:52.367991 1160558 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:12:52.368077 1160558 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:12:52.371658 1160558 start.go:564] Will wait 60s for crictl version
	I1218 00:12:52.371723 1160558 ssh_runner.go:195] Run: which crictl
	I1218 00:12:52.375119 1160558 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:12:52.401626 1160558 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:12:52.401725 1160558 ssh_runner.go:195] Run: crio --version
	I1218 00:12:52.431262 1160558 ssh_runner.go:195] Run: crio --version
	I1218 00:12:52.462900 1160558 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:12:52.465795 1160558 cli_runner.go:164] Run: docker network inspect addons-399099 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:12:52.482740 1160558 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:12:52.486480 1160558 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 00:12:52.496049 1160558 kubeadm.go:884] updating cluster {Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNa
mes:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketV
MnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:12:52.496178 1160558 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:52.496257 1160558 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:12:52.530233 1160558 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:12:52.530256 1160558 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:12:52.530311 1160558 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:12:52.553538 1160558 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:12:52.553560 1160558 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:12:52.553569 1160558 kubeadm.go:935] updating node { 192.168.49.2 8443 v1.34.3 crio true true} ...
	I1218 00:12:52.553700 1160558 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=addons-399099 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:12:52.553782 1160558 ssh_runner.go:195] Run: crio config
	I1218 00:12:52.624851 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:12:52.624871 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:52.624889 1160558 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:12:52.624934 1160558 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-399099 NodeName:addons-399099 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kuberne
tes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:12:52.625107 1160558 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "addons-399099"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:12:52.625184 1160558 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:12:52.632518 1160558 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:12:52.632611 1160558 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:12:52.639991 1160558 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (363 bytes)
	I1218 00:12:52.652028 1160558 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:12:52.664668 1160558 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2210 bytes)
	I1218 00:12:52.676343 1160558 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:12:52.679659 1160558 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 00:12:52.688914 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:12:52.811066 1160558 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:12:52.827386 1160558 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099 for IP: 192.168.49.2
	I1218 00:12:52.827405 1160558 certs.go:195] generating shared ca certs ...
	I1218 00:12:52.827420 1160558 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:52.827548 1160558 certs.go:241] generating "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:12:53.092915 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt ...
	I1218 00:12:53.092948 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt: {Name:mk226e9dd5b352dedeaeb4a78738225ca3d6135a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.093145 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key ...
	I1218 00:12:53.093157 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key: {Name:mk671944f0854499fc6e3ec5d6820eacd490e2cf Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.093253 1160558 certs.go:241] generating "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:12:53.255736 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt ...
	I1218 00:12:53.255768 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt: {Name:mk4ade57f1513111ab4e4ce9561fbffb032cb5ab Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.255934 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key ...
	I1218 00:12:53.255950 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key: {Name:mkd33963319741682b24d6e4e71cc086455d2530 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.256027 1160558 certs.go:257] generating profile certs ...
	I1218 00:12:53.256087 1160558 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key
	I1218 00:12:53.256105 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt with IP's: []
	I1218 00:12:53.400824 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt ...
	I1218 00:12:53.400859 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: {Name:mk352511a18cc8ba8ba10982fc22a75b8603ce38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.401023 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key ...
	I1218 00:12:53.401041 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.key: {Name:mka05f935856a494c785edbeaf9a11144edf222c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.401118 1160558 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781
	I1218 00:12:53.401140 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1218 00:12:53.616403 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 ...
	I1218 00:12:53.616431 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781: {Name:mka697dee47c7b6e4349ec95a83ee44198c7f8fd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.616596 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781 ...
	I1218 00:12:53.616609 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781: {Name:mk9858a1b6fc540bef1583e9a5cf3680480d63f3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.616687 1160558 certs.go:382] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt.fa144781 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt
	I1218 00:12:53.616768 1160558 certs.go:386] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key.fa144781 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key
	I1218 00:12:53.616822 1160558 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key
	I1218 00:12:53.616844 1160558 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt with IP's: []
	I1218 00:12:53.702402 1160558 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt ...
	I1218 00:12:53.702433 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt: {Name:mk6712bb91eec42828d26747cdd3175be72765a6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.702614 1160558 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key ...
	I1218 00:12:53.702628 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key: {Name:mk9824bd470321d26bfc0f189b1ee2d620ed19f9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:53.702823 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:12:53.702868 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:12:53.702894 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:12:53.702935 1160558 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:12:53.703493 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:12:53.722812 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:12:53.740139 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:12:53.757546 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:12:53.774411 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1218 00:12:53.791798 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:12:53.808743 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:12:53.826544 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:12:53.844509 1160558 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:12:53.861591 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:12:53.873985 1160558 ssh_runner.go:195] Run: openssl version
	I1218 00:12:53.880210 1160558 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.887475 1160558 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:12:53.894823 1160558 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.898452 1160558 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.898518 1160558 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:12:53.939206 1160558 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:12:53.946242 1160558 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1218 00:12:53.953253 1160558 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:12:53.956817 1160558 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1218 00:12:53.956868 1160558 kubeadm.go:401] StartCluster: {Name:addons-399099 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:addons-399099 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:53.956951 1160558 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:12:53.957011 1160558 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:12:53.982546 1160558 cri.go:89] found id: ""
	I1218 00:12:53.982615 1160558 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:12:53.990303 1160558 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:12:53.997965 1160558 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:12:53.998061 1160558 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:12:54.007130 1160558 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:12:54.007164 1160558 kubeadm.go:158] found existing configuration files:
	
	I1218 00:12:54.007236 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1218 00:12:54.016300 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:12:54.016440 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:12:54.024438 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1218 00:12:54.032378 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:12:54.032451 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:12:54.039970 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1218 00:12:54.048662 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:12:54.048756 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:12:54.056589 1160558 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1218 00:12:54.064522 1160558 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:12:54.064598 1160558 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:12:54.072168 1160558 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:12:54.114560 1160558 kubeadm.go:319] [init] Using Kubernetes version: v1.34.3
	I1218 00:12:54.114808 1160558 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:12:54.139197 1160558 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:12:54.139315 1160558 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:12:54.139380 1160558 kubeadm.go:319] OS: Linux
	I1218 00:12:54.139448 1160558 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:12:54.139519 1160558 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:12:54.139593 1160558 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:12:54.139670 1160558 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:12:54.139739 1160558 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:12:54.139811 1160558 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:12:54.139903 1160558 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:12:54.140000 1160558 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:12:54.140075 1160558 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:12:54.208462 1160558 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:12:54.208659 1160558 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:12:54.208804 1160558 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:12:54.215551 1160558 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:12:54.222075 1160558 out.go:252]   - Generating certificates and keys ...
	I1218 00:12:54.222167 1160558 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:12:54.222230 1160558 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:12:54.550604 1160558 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1218 00:12:54.889036 1160558 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1218 00:12:55.080946 1160558 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1218 00:12:55.261829 1160558 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1218 00:12:55.643453 1160558 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1218 00:12:55.643761 1160558 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [addons-399099 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:12:56.046339 1160558 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1218 00:12:56.046548 1160558 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [addons-399099 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:12:56.821447 1160558 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1218 00:12:58.046501 1160558 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1218 00:12:58.354674 1160558 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1218 00:12:58.354963 1160558 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:12:58.634419 1160558 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:12:59.164859 1160558 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:13:00.401909 1160558 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:13:00.623079 1160558 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:13:02.190430 1160558 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:13:02.191124 1160558 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:13:02.193847 1160558 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:13:02.197226 1160558 out.go:252]   - Booting up control plane ...
	I1218 00:13:02.197323 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:13:02.197405 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:13:02.197473 1160558 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:13:02.212130 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:13:02.212276 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:13:02.220546 1160558 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:13:02.220827 1160558 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:13:02.221014 1160558 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:13:02.367718 1160558 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:13:02.367845 1160558 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:13:02.871530 1160558 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 501.745896ms
	I1218 00:13:02.873845 1160558 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1218 00:13:02.874104 1160558 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.49.2:8443/livez
	I1218 00:13:02.874199 1160558 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1218 00:13:02.874277 1160558 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1218 00:13:05.937033 1160558 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 3.061615159s
	I1218 00:13:07.416710 1160558 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 4.541717198s
	I1218 00:13:08.876605 1160558 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 6.00134908s
	I1218 00:13:08.910569 1160558 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1218 00:13:08.924720 1160558 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1218 00:13:08.938970 1160558 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1218 00:13:08.939170 1160558 kubeadm.go:319] [mark-control-plane] Marking the node addons-399099 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1218 00:13:08.950640 1160558 kubeadm.go:319] [bootstrap-token] Using token: 2958n9.ll17if4da92gcu4g
	I1218 00:13:08.953625 1160558 out.go:252]   - Configuring RBAC rules ...
	I1218 00:13:08.953755 1160558 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1218 00:13:08.957992 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1218 00:13:08.967908 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1218 00:13:08.974957 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1218 00:13:08.979252 1160558 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1218 00:13:08.983385 1160558 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1218 00:13:09.284621 1160558 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1218 00:13:09.716822 1160558 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1218 00:13:10.284367 1160558 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1218 00:13:10.285505 1160558 kubeadm.go:319] 
	I1218 00:13:10.285581 1160558 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1218 00:13:10.285586 1160558 kubeadm.go:319] 
	I1218 00:13:10.285659 1160558 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1218 00:13:10.285663 1160558 kubeadm.go:319] 
	I1218 00:13:10.285687 1160558 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1218 00:13:10.285793 1160558 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1218 00:13:10.285845 1160558 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1218 00:13:10.285849 1160558 kubeadm.go:319] 
	I1218 00:13:10.285909 1160558 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1218 00:13:10.285914 1160558 kubeadm.go:319] 
	I1218 00:13:10.285984 1160558 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1218 00:13:10.285999 1160558 kubeadm.go:319] 
	I1218 00:13:10.286125 1160558 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1218 00:13:10.286224 1160558 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1218 00:13:10.286331 1160558 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1218 00:13:10.286340 1160558 kubeadm.go:319] 
	I1218 00:13:10.286440 1160558 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1218 00:13:10.286540 1160558 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1218 00:13:10.286550 1160558 kubeadm.go:319] 
	I1218 00:13:10.286657 1160558 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token 2958n9.ll17if4da92gcu4g \
	I1218 00:13:10.286819 1160558 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:18af86f6ab3e0657d733c8936184202396957856d244f2643507ab37d928e53b \
	I1218 00:13:10.286846 1160558 kubeadm.go:319] 	--control-plane 
	I1218 00:13:10.286855 1160558 kubeadm.go:319] 
	I1218 00:13:10.286966 1160558 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1218 00:13:10.286975 1160558 kubeadm.go:319] 
	I1218 00:13:10.287076 1160558 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token 2958n9.ll17if4da92gcu4g \
	I1218 00:13:10.287227 1160558 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:18af86f6ab3e0657d733c8936184202396957856d244f2643507ab37d928e53b 
	I1218 00:13:10.290707 1160558 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1218 00:13:10.291016 1160558 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:13:10.291162 1160558 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:13:10.291193 1160558 cni.go:84] Creating CNI manager for ""
	I1218 00:13:10.291217 1160558 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:13:10.296432 1160558 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1218 00:13:10.299477 1160558 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1218 00:13:10.303965 1160558 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.3/kubectl ...
	I1218 00:13:10.303987 1160558 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2620 bytes)
	I1218 00:13:10.319641 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1218 00:13:10.607349 1160558 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1218 00:13:10.607500 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:10.607583 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes addons-399099 minikube.k8s.io/updated_at=2025_12_18T00_13_10_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924 minikube.k8s.io/name=addons-399099 minikube.k8s.io/primary=true
	I1218 00:13:10.744966 1160558 ops.go:34] apiserver oom_adj: -16
	I1218 00:13:10.745096 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:11.246120 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:11.745204 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:12.245453 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:12.745201 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:13.245235 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:13.745703 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.245126 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.745440 1160558 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1218 00:13:14.866125 1160558 kubeadm.go:1114] duration metric: took 4.258692816s to wait for elevateKubeSystemPrivileges
	I1218 00:13:14.866156 1160558 kubeadm.go:403] duration metric: took 20.909291982s to StartCluster
	I1218 00:13:14.866173 1160558 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:13:14.866294 1160558 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:13:14.866715 1160558 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:13:14.866897 1160558 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:13:14.867070 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1218 00:13:14.867320 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:13:14.867348 1160558 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:true auto-pause:false cloud-spanner:true csi-hostpath-driver:true dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:true gvisor:false headlamp:false inaccel:false ingress:true ingress-dns:true inspektor-gadget:true istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:true nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:true registry-aliases:false registry-creds:true storage-provisioner:true storage-provisioner-rancher:true volcano:true volumesnapshots:true yakd:true]
	I1218 00:13:14.867432 1160558 addons.go:70] Setting yakd=true in profile "addons-399099"
	I1218 00:13:14.867446 1160558 addons.go:239] Setting addon yakd=true in "addons-399099"
	I1218 00:13:14.867467 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.867949 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.868493 1160558 addons.go:70] Setting metrics-server=true in profile "addons-399099"
	I1218 00:13:14.868510 1160558 addons.go:239] Setting addon metrics-server=true in "addons-399099"
	I1218 00:13:14.868534 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.868935 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.869175 1160558 addons.go:70] Setting nvidia-device-plugin=true in profile "addons-399099"
	I1218 00:13:14.869206 1160558 addons.go:239] Setting addon nvidia-device-plugin=true in "addons-399099"
	I1218 00:13:14.869254 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.870516 1160558 addons.go:70] Setting registry=true in profile "addons-399099"
	I1218 00:13:14.870533 1160558 addons.go:239] Setting addon registry=true in "addons-399099"
	I1218 00:13:14.870553 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.871058 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.871952 1160558 addons.go:70] Setting registry-creds=true in profile "addons-399099"
	I1218 00:13:14.871997 1160558 addons.go:239] Setting addon registry-creds=true in "addons-399099"
	I1218 00:13:14.872046 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.872611 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.873356 1160558 addons.go:70] Setting amd-gpu-device-plugin=true in profile "addons-399099"
	I1218 00:13:14.873439 1160558 addons.go:239] Setting addon amd-gpu-device-plugin=true in "addons-399099"
	I1218 00:13:14.873510 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.874249 1160558 addons.go:70] Setting cloud-spanner=true in profile "addons-399099"
	I1218 00:13:14.874267 1160558 addons.go:239] Setting addon cloud-spanner=true in "addons-399099"
	I1218 00:13:14.874296 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.874862 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.880033 1160558 addons.go:70] Setting storage-provisioner=true in profile "addons-399099"
	I1218 00:13:14.880107 1160558 addons.go:239] Setting addon storage-provisioner=true in "addons-399099"
	I1218 00:13:14.880154 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.880733 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.888655 1160558 addons.go:70] Setting storage-provisioner-rancher=true in profile "addons-399099"
	I1218 00:13:14.888682 1160558 addons_storage_classes.go:34] enableOrDisableStorageClasses storage-provisioner-rancher=true on "addons-399099"
	I1218 00:13:14.889003 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.890214 1160558 addons.go:70] Setting csi-hostpath-driver=true in profile "addons-399099"
	I1218 00:13:14.890277 1160558 addons.go:239] Setting addon csi-hostpath-driver=true in "addons-399099"
	I1218 00:13:14.890303 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.890725 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.899984 1160558 addons.go:70] Setting volcano=true in profile "addons-399099"
	I1218 00:13:14.900015 1160558 addons.go:239] Setting addon volcano=true in "addons-399099"
	I1218 00:13:14.900102 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.900744 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.902412 1160558 addons.go:70] Setting default-storageclass=true in profile "addons-399099"
	I1218 00:13:14.902437 1160558 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "addons-399099"
	I1218 00:13:14.902884 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.920361 1160558 addons.go:70] Setting volumesnapshots=true in profile "addons-399099"
	I1218 00:13:14.920392 1160558 addons.go:239] Setting addon volumesnapshots=true in "addons-399099"
	I1218 00:13:14.920426 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.920895 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.921064 1160558 addons.go:70] Setting gcp-auth=true in profile "addons-399099"
	I1218 00:13:14.921083 1160558 mustload.go:66] Loading cluster: addons-399099
	I1218 00:13:14.921258 1160558 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:13:14.921478 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.937180 1160558 addons.go:70] Setting ingress=true in profile "addons-399099"
	I1218 00:13:14.937212 1160558 addons.go:239] Setting addon ingress=true in "addons-399099"
	I1218 00:13:14.937252 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.937817 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.940660 1160558 out.go:179]   - Using image docker.io/marcnuri/yakd:0.0.6
	I1218 00:13:14.944600 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-ns.yaml
	I1218 00:13:14.944626 1160558 ssh_runner.go:362] scp yakd/yakd-ns.yaml --> /etc/kubernetes/addons/yakd-ns.yaml (171 bytes)
	I1218 00:13:14.944691 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:14.964068 1160558 addons.go:70] Setting ingress-dns=true in profile "addons-399099"
	I1218 00:13:14.964103 1160558 addons.go:239] Setting addon ingress-dns=true in "addons-399099"
	I1218 00:13:14.964159 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:14.964768 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.971787 1160558 out.go:179] * Verifying Kubernetes components...
	I1218 00:13:14.975756 1160558 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:13:14.975890 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:14.980937 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.018087 1160558 addons.go:70] Setting inspektor-gadget=true in profile "addons-399099"
	I1218 00:13:15.018129 1160558 addons.go:239] Setting addon inspektor-gadget=true in "addons-399099"
	I1218 00:13:15.018179 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.018696 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.059598 1160558 out.go:179]   - Using image registry.k8s.io/metrics-server/metrics-server:v0.8.0
	I1218 00:13:15.063266 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I1218 00:13:15.063298 1160558 ssh_runner.go:362] scp metrics-server/metrics-apiservice.yaml --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I1218 00:13:15.063367 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.113619 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.138871 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:13:15.145504 1160558 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:13:15.145572 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:13:15.145678 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.164197 1160558 addons.go:239] Setting addon storage-provisioner-rancher=true in "addons-399099"
	I1218 00:13:15.164302 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.164735 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.166589 1160558 out.go:179]   - Using image nvcr.io/nvidia/k8s-device-plugin:v0.18.1
	I1218 00:13:15.167357 1160558 out.go:179]   - Using image docker.io/registry:3.0.0
	I1218 00:13:15.183044 1160558 out.go:179]   - Using image gcr.io/cloud-spanner-emulator/emulator:1.5.45
	I1218 00:13:15.186532 1160558 addons.go:436] installing /etc/kubernetes/addons/deployment.yaml
	I1218 00:13:15.186557 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/deployment.yaml (1004 bytes)
	I1218 00:13:15.186621 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.207119 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/kube-registry-proxy:0.0.9
	I1218 00:13:15.208083 1160558 out.go:179]   - Using image docker.io/upmcenterprises/registry-creds:1.10
	I1218 00:13:15.208327 1160558 addons.go:436] installing /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1218 00:13:15.208350 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/nvidia-device-plugin.yaml (1966 bytes)
	I1218 00:13:15.208420 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.212082 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.214707 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0
	I1218 00:13:15.215886 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-creds-rc.yaml
	I1218 00:13:15.215902 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-creds-rc.yaml (3306 bytes)
	I1218 00:13:15.215958 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.246310 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-rc.yaml
	I1218 00:13:15.246330 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (860 bytes)
	I1218 00:13:15.246390 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.255949 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/snapshot-controller:v6.1.0
	I1218 00:13:15.256340 1160558 out.go:179]   - Using image docker.io/rocm/k8s-device-plugin:1.25.2.8
	I1218 00:13:15.277823 1160558 addons.go:436] installing /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1218 00:13:15.288419 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/amd-gpu-device-plugin.yaml (1868 bytes)
	I1218 00:13:15.288587 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.293997 1160558 addons.go:239] Setting addon default-storageclass=true in "addons-399099"
	I1218 00:13:15.294038 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:15.294538 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:15.310323 1160558 out.go:179]   - Using image docker.io/kicbase/minikube-ingress-dns:0.0.4
	I1218 00:13:15.315525 1160558 addons.go:436] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1218 00:13:15.315548 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2889 bytes)
	I1218 00:13:15.315627 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.323609 1160558 out.go:179]   - Using image ghcr.io/inspektor-gadget/inspektor-gadget:v0.47.0
	W1218 00:13:15.282436 1160558 out.go:285] ! Enabling 'volcano' returned an error: running callbacks: [volcano addon does not support crio]
	I1218 00:13:15.323752 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-provisioner:v3.3.0
	I1218 00:13:15.323835 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/controller:v1.14.1
	I1218 00:13:15.323990 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I1218 00:13:15.325886 1160558 ssh_runner.go:362] scp volumesnapshots/csi-hostpath-snapshotclass.yaml --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I1218 00:13:15.325957 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.347914 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-attacher:v4.0.0
	I1218 00:13:15.356213 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-external-health-monitor-controller:v0.7.0
	I1218 00:13:15.360936 1160558 addons.go:436] installing /etc/kubernetes/addons/ig-deployment.yaml
	I1218 00:13:15.360962 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-deployment.yaml (15034 bytes)
	I1218 00:13:15.361028 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.361517 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.373183 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.6.0
	I1218 00:13:15.373339 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:15.373543 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.381017 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:15.381208 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/hostpathplugin:v1.9.0
	I1218 00:13:15.384952 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/livenessprobe:v2.8.0
	I1218 00:13:15.387861 1160558 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-resizer:v1.6.0
	I1218 00:13:15.390743 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I1218 00:13:15.390762 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-attacher.yaml --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I1218 00:13:15.390956 1160558 addons.go:436] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I1218 00:13:15.390992 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (16078 bytes)
	I1218 00:13:15.391084 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.400602 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.420479 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.426494 1160558 out.go:179]   - Using image docker.io/rancher/local-path-provisioner:v0.0.22
	I1218 00:13:15.428879 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.429675 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.432962 1160558 out.go:179]   - Using image docker.io/busybox:stable
	I1218 00:13:15.435979 1160558 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1218 00:13:15.436001 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner-rancher.yaml (3113 bytes)
	I1218 00:13:15.436065 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.464979 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.501964 1160558 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:13:15.501984 1160558 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:13:15.502048 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:15.511905 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.512971 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.527508 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.555601 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.567823 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.568411 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:15.576935 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	W1218 00:13:15.577467 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.577489 1160558 retry.go:31] will retry after 136.553495ms: ssh: handshake failed: EOF
	I1218 00:13:15.591535 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	W1218 00:13:15.593042 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.593068 1160558 retry.go:31] will retry after 292.605626ms: ssh: handshake failed: EOF
	W1218 00:13:15.717522 1160558 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1218 00:13:15.717596 1160558 retry.go:31] will retry after 415.307284ms: ssh: handshake failed: EOF
	I1218 00:13:15.832749 1160558 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:13:15.833011 1160558 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1218 00:13:16.128489 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-sa.yaml
	I1218 00:13:16.128515 1160558 ssh_runner.go:362] scp yakd/yakd-sa.yaml --> /etc/kubernetes/addons/yakd-sa.yaml (247 bytes)
	I1218 00:13:16.311097 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-crb.yaml
	I1218 00:13:16.311121 1160558 ssh_runner.go:362] scp yakd/yakd-crb.yaml --> /etc/kubernetes/addons/yakd-crb.yaml (422 bytes)
	I1218 00:13:16.344708 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml
	I1218 00:13:16.375410 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I1218 00:13:16.375431 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1907 bytes)
	I1218 00:13:16.433483 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1218 00:13:16.479377 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/deployment.yaml
	I1218 00:13:16.492437 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:13:16.571673 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1218 00:13:16.577225 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-hostpath.yaml
	I1218 00:13:16.577251 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-hostpath.yaml --> /etc/kubernetes/addons/rbac-hostpath.yaml (4266 bytes)
	I1218 00:13:16.654419 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-svc.yaml
	I1218 00:13:16.654445 1160558 ssh_runner.go:362] scp yakd/yakd-svc.yaml --> /etc/kubernetes/addons/yakd-svc.yaml (412 bytes)
	I1218 00:13:16.663713 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml
	I1218 00:13:16.668168 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1218 00:13:16.683117 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1218 00:13:16.689100 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I1218 00:13:16.689125 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotclasses.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I1218 00:13:16.693522 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-svc.yaml
	I1218 00:13:16.693544 1160558 ssh_runner.go:362] scp registry/registry-svc.yaml --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I1218 00:13:16.763631 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I1218 00:13:16.763654 1160558 ssh_runner.go:362] scp metrics-server/metrics-server-rbac.yaml --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I1218 00:13:16.786734 1160558 addons.go:436] installing /etc/kubernetes/addons/yakd-dp.yaml
	I1218 00:13:16.786758 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-dp.yaml (2017 bytes)
	I1218 00:13:16.833328 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I1218 00:13:16.833353 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotcontents.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I1218 00:13:16.897988 1160558 addons.go:436] installing /etc/kubernetes/addons/registry-proxy.yaml
	I1218 00:13:16.898011 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (947 bytes)
	I1218 00:13:17.050754 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I1218 00:13:17.050779 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-health-monitor-controller.yaml --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3038 bytes)
	I1218 00:13:17.090411 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:13:17.098957 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml
	I1218 00:13:17.141139 1160558 addons.go:436] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 00:13:17.141167 1160558 ssh_runner.go:362] scp metrics-server/metrics-server-service.yaml --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I1218 00:13:17.164133 1160558 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I1218 00:13:17.164160 1160558 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshots.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I1218 00:13:17.209778 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I1218 00:13:17.250996 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I1218 00:13:17.289090 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I1218 00:13:17.289116 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-provisioner.yaml --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (4442 bytes)
	I1218 00:13:17.373114 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I1218 00:13:17.415857 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I1218 00:13:17.415883 1160558 ssh_runner.go:362] scp volumesnapshots/rbac-volume-snapshot-controller.yaml --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3545 bytes)
	I1218 00:13:17.580365 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I1218 00:13:17.580394 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-resizer.yaml --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2943 bytes)
	I1218 00:13:17.666250 1160558 addons.go:436] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:17.666273 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1475 bytes)
	I1218 00:13:17.739054 1160558 addons.go:436] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I1218 00:13:17.739080 1160558 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-snapshotter.yaml --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3149 bytes)
	I1218 00:13:17.785843 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I1218 00:13:17.785868 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2143 bytes)
	I1218 00:13:18.134700 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:18.144384 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I1218 00:13:18.144409 1160558 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-driverinfo.yaml --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1274 bytes)
	I1218 00:13:18.482258 1160558 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (2.649199057s)
	I1218 00:13:18.482289 1160558 start.go:977] {"host.minikube.internal": 192.168.49.1} host record injected into CoreDNS's ConfigMap
	I1218 00:13:18.483233 1160558 ssh_runner.go:235] Completed: sudo systemctl start kubelet: (2.650397883s)
	I1218 00:13:18.483964 1160558 node_ready.go:35] waiting up to 6m0s for node "addons-399099" to be "Ready" ...
	I1218 00:13:18.543606 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I1218 00:13:18.543667 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (8201 bytes)
	I1218 00:13:18.762791 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I1218 00:13:18.762858 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2191 bytes)
	I1218 00:13:18.986277 1160558 kapi.go:214] "coredns" deployment in "kube-system" namespace and "addons-399099" context rescaled to 1 replicas
	I1218 00:13:19.039721 1160558 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I1218 00:13:19.039743 1160558 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-storageclass.yaml --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I1218 00:13:19.056976 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	W1218 00:13:20.525538 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:20.678106 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml: (4.333286912s)
	I1218 00:13:20.678159 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml: (4.24461104s)
	I1218 00:13:20.678198 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/deployment.yaml: (4.198798563s)
	I1218 00:13:20.932203 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (4.439731351s)
	I1218 00:13:21.058080 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml: (4.486368s)
	I1218 00:13:21.058334 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml: (4.394592528s)
	I1218 00:13:21.352972 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (4.684764819s)
	I1218 00:13:21.353021 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml: (4.66987921s)
	I1218 00:13:21.353044 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (4.262611682s)
	I1218 00:13:21.420861 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (4.211015212s)
	I1218 00:13:21.420944 1160558 addons.go:495] Verifying addon registry=true in "addons-399099"
	I1218 00:13:21.420983 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml: (4.321997708s)
	I1218 00:13:21.426093 1160558 out.go:179] * To access YAKD - Kubernetes Dashboard, wait for Pod to be ready and run the following command:
	
		minikube -p addons-399099 service yakd-dashboard -n yakd-dashboard
	
	I1218 00:13:21.426210 1160558 out.go:179] * Verifying registry addon...
	I1218 00:13:21.429741 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I1218 00:13:21.442753 1160558 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=registry
	I1218 00:13:21.442772 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:21.940323 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.179905 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (4.806744294s)
	I1218 00:13:22.179977 1160558 addons.go:495] Verifying addon metrics-server=true in "addons-399099"
	I1218 00:13:22.180094 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (4.045361931s)
	W1218 00:13:22.180134 1160558 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1218 00:13:22.180163 1160558 retry.go:31] will retry after 152.467875ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1218 00:13:22.180350 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (4.929323768s)
	I1218 00:13:22.180381 1160558 addons.go:495] Verifying addon ingress=true in "addons-399099"
	I1218 00:13:22.183637 1160558 out.go:179] * Verifying ingress addon...
	I1218 00:13:22.187116 1160558 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I1218 00:13:22.194137 1160558 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I1218 00:13:22.194158 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:22.333679 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1218 00:13:22.438102 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.443837 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (3.386812222s)
	I1218 00:13:22.443868 1160558 addons.go:495] Verifying addon csi-hostpath-driver=true in "addons-399099"
	I1218 00:13:22.446780 1160558 out.go:179] * Verifying csi-hostpath-driver addon...
	I1218 00:13:22.450275 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I1218 00:13:22.465849 1160558 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1218 00:13:22.465871 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:22.696482 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:22.872580 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I1218 00:13:22.872686 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:22.888891 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:22.933942 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:22.953714 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:22.987698 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:23.028920 1160558 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I1218 00:13:23.042429 1160558 addons.go:239] Setting addon gcp-auth=true in "addons-399099"
	I1218 00:13:23.042477 1160558 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:13:23.042956 1160558 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:13:23.059867 1160558 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I1218 00:13:23.059932 1160558 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:13:23.077686 1160558 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:13:23.190629 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:23.432619 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:23.453343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:23.690783 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:23.933375 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:23.953167 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:24.191101 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:24.433617 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:24.453183 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:24.690896 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:24.933554 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:24.954039 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:25.052181 1160558 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (2.718457288s)
	I1218 00:13:25.052259 1160558 ssh_runner.go:235] Completed: cat /var/lib/minikube/google_application_credentials.json: (1.992366855s)
	I1218 00:13:25.055677 1160558 out.go:179]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.1.3
	I1218 00:13:25.058563 1160558 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.5
	I1218 00:13:25.061445 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I1218 00:13:25.061474 1160558 ssh_runner.go:362] scp gcp-auth/gcp-auth-ns.yaml --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I1218 00:13:25.076075 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I1218 00:13:25.076098 1160558 ssh_runner.go:362] scp gcp-auth/gcp-auth-service.yaml --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I1218 00:13:25.090351 1160558 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1218 00:13:25.090380 1160558 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (5421 bytes)
	I1218 00:13:25.104862 1160558 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1218 00:13:25.191302 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:25.432994 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:25.454697 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:25.487792 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:25.612099 1160558 addons.go:495] Verifying addon gcp-auth=true in "addons-399099"
	I1218 00:13:25.615147 1160558 out.go:179] * Verifying gcp-auth addon...
	I1218 00:13:25.618778 1160558 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I1218 00:13:25.627384 1160558 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I1218 00:13:25.627412 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:25.730152 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:25.933175 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:25.953785 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:26.122388 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:26.190141 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:26.432758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:26.453492 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:26.621742 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:26.690987 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:26.932584 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:26.953143 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:27.122328 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:27.190037 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:27.433170 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:27.453989 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:27.622593 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:27.690423 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:27.933546 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:27.953274 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	W1218 00:13:27.986925 1160558 node_ready.go:57] node "addons-399099" has "Ready":"False" status (will retry)
	I1218 00:13:28.122107 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:28.191024 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:28.433235 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:28.453038 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:28.622130 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:28.723461 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:28.970714 1160558 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I1218 00:13:28.970739 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:29.000901 1160558 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1218 00:13:29.000930 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:29.008723 1160558 node_ready.go:49] node "addons-399099" is "Ready"
	I1218 00:13:29.008758 1160558 node_ready.go:38] duration metric: took 10.524766521s for node "addons-399099" to be "Ready" ...
	I1218 00:13:29.008772 1160558 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:13:29.008835 1160558 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:13:29.030227 1160558 api_server.go:72] duration metric: took 14.163302322s to wait for apiserver process to appear ...
	I1218 00:13:29.030257 1160558 api_server.go:88] waiting for apiserver healthz status ...
	I1218 00:13:29.030279 1160558 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8443/healthz ...
	I1218 00:13:29.073596 1160558 api_server.go:279] https://192.168.49.2:8443/healthz returned 200:
	ok
	I1218 00:13:29.079314 1160558 api_server.go:141] control plane version: v1.34.3
	I1218 00:13:29.079346 1160558 api_server.go:131] duration metric: took 49.080913ms to wait for apiserver health ...
	I1218 00:13:29.079356 1160558 system_pods.go:43] waiting for kube-system pods to appear ...
	I1218 00:13:29.108523 1160558 system_pods.go:59] 19 kube-system pods found
	I1218 00:13:29.108565 1160558 system_pods.go:61] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending
	I1218 00:13:29.108572 1160558 system_pods.go:61] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.108577 1160558 system_pods.go:61] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending
	I1218 00:13:29.108580 1160558 system_pods.go:61] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.108584 1160558 system_pods.go:61] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.108587 1160558 system_pods.go:61] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.108591 1160558 system_pods.go:61] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.108595 1160558 system_pods.go:61] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.108599 1160558 system_pods.go:61] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.108603 1160558 system_pods.go:61] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.108606 1160558 system_pods.go:61] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.108610 1160558 system_pods.go:61] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.108614 1160558 system_pods.go:61] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.108620 1160558 system_pods.go:61] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.108625 1160558 system_pods.go:61] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.108636 1160558 system_pods.go:61] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.108639 1160558 system_pods.go:61] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.108643 1160558 system_pods.go:61] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.108650 1160558 system_pods.go:61] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.108665 1160558 system_pods.go:74] duration metric: took 29.3037ms to wait for pod list to return data ...
	I1218 00:13:29.108675 1160558 default_sa.go:34] waiting for default service account to be created ...
	I1218 00:13:29.116548 1160558 default_sa.go:45] found service account: "default"
	I1218 00:13:29.116579 1160558 default_sa.go:55] duration metric: took 7.892262ms for default service account to be created ...
	I1218 00:13:29.116589 1160558 system_pods.go:116] waiting for k8s-apps to be running ...
	I1218 00:13:29.141933 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:29.142696 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.142730 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending
	I1218 00:13:29.142737 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.142747 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending
	I1218 00:13:29.142751 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.142756 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.142761 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.142766 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.142777 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.142781 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.142792 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.142797 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.142801 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.142814 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.142818 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.142821 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.142825 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.142829 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.142834 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.142848 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.142867 1160558 retry.go:31] will retry after 202.508545ms: missing components: kube-dns
	I1218 00:13:29.218820 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:29.388158 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.388199 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:29.388207 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending
	I1218 00:13:29.388215 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:29.388292 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending
	I1218 00:13:29.388308 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.388314 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.388319 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.388330 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.388337 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending
	I1218 00:13:29.388341 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.388350 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.388367 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending
	I1218 00:13:29.388379 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending
	I1218 00:13:29.388383 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending
	I1218 00:13:29.388401 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.388412 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending
	I1218 00:13:29.388417 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.388421 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.388427 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.388447 1160558 retry.go:31] will retry after 255.169438ms: missing components: kube-dns
	I1218 00:13:29.448447 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:29.473924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:29.629343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:29.652795 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:29.652834 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:29.652843 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:29.652851 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:29.652857 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:29.652862 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:29.652867 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:29.652871 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:29.652876 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:29.652887 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:29.652891 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:29.652896 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:29.652902 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:29.652916 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:29.652923 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:29.652934 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending
	I1218 00:13:29.652940 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:29.652944 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending
	I1218 00:13:29.652948 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending
	I1218 00:13:29.652954 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1218 00:13:29.652971 1160558 retry.go:31] will retry after 373.193664ms: missing components: kube-dns
	I1218 00:13:29.692383 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:29.934326 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.040866 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:30.040923 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:30.040936 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:30.040945 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:30.040953 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:30.040962 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:30.040968 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:30.040989 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:30.041002 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:30.041010 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:30.041020 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:30.041025 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:30.041034 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:30.041044 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:30.041050 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:30.041065 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:30.041073 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:30.042159 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:30.042242 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.042776 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.042791 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:30.042811 1160558 retry.go:31] will retry after 524.226404ms: missing components: kube-dns
	I1218 00:13:30.135743 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:30.191213 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:30.433634 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.454234 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:30.572011 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:30.572094 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:30.572121 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:30.572144 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:30.572167 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:30.572188 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:30.572208 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:30.572254 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:30.572274 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:30.572296 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:30.572315 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:30.572335 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:30.572359 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:30.572381 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:30.572403 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:30.572447 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:30.572469 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:30.572490 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.572524 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:30.572549 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:30.572579 1160558 retry.go:31] will retry after 553.713784ms: missing components: kube-dns
	I1218 00:13:30.622724 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:30.691185 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:30.935679 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:30.966001 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:31.122607 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:31.132313 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:31.132401 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 00:13:31.132426 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:31.132450 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:31.132472 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:31.132493 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:31.132516 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:31.132537 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:31.132557 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:31.132581 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:31.132602 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:31.132621 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:31.132645 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:31.132678 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:31.132705 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:31.132727 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:31.132757 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:31.132780 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.132802 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.132821 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:31.132859 1160558 retry.go:31] will retry after 742.024058ms: missing components: kube-dns
	I1218 00:13:31.190846 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:31.433452 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:31.454121 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:31.622470 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:31.690576 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:31.887511 1160558 system_pods.go:86] 19 kube-system pods found
	I1218 00:13:31.887545 1160558 system_pods.go:89] "coredns-66bc5c9577-clntp" [4bb34bcf-fb42-4982-9481-ab3ad5363555] Running
	I1218 00:13:31.887557 1160558 system_pods.go:89] "csi-hostpath-attacher-0" [096194bb-5720-43c3-947f-f19a1866e25e] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1218 00:13:31.887565 1160558 system_pods.go:89] "csi-hostpath-resizer-0" [2208ffad-c00b-488a-9450-ca6325fbbafa] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1218 00:13:31.887574 1160558 system_pods.go:89] "csi-hostpathplugin-5v2nz" [dc0308e9-9513-4cf3-bd8e-61d46d16ba68] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1218 00:13:31.887580 1160558 system_pods.go:89] "etcd-addons-399099" [189be4a2-7f7a-4868-9b5a-952a95d625a8] Running
	I1218 00:13:31.887585 1160558 system_pods.go:89] "kindnet-gxdvh" [8b8e2896-931d-41d5-ae13-d615cf899685] Running
	I1218 00:13:31.887595 1160558 system_pods.go:89] "kube-apiserver-addons-399099" [1cb73fde-3a3a-4b6f-8257-6373ce717cac] Running
	I1218 00:13:31.887600 1160558 system_pods.go:89] "kube-controller-manager-addons-399099" [ca57dfc3-a23f-43c4-8181-40876fc3f686] Running
	I1218 00:13:31.887612 1160558 system_pods.go:89] "kube-ingress-dns-minikube" [4f9fe5d4-8542-4087-963d-899a95ce9de9] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1218 00:13:31.887618 1160558 system_pods.go:89] "kube-proxy-7lfkl" [1bbd2411-ec02-477e-bfad-b2cc2aefedc4] Running
	I1218 00:13:31.887628 1160558 system_pods.go:89] "kube-scheduler-addons-399099" [d3678f23-cbe3-4722-a118-cbc616649a23] Running
	I1218 00:13:31.887634 1160558 system_pods.go:89] "metrics-server-85b7d694d7-b7rjb" [e18cbb94-8598-4aac-9d6f-63de1a142379] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1218 00:13:31.887641 1160558 system_pods.go:89] "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1218 00:13:31.887651 1160558 system_pods.go:89] "registry-6b586f9694-k4nhf" [53c33f12-7aef-4450-9f23-e95b879739cb] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1218 00:13:31.887659 1160558 system_pods.go:89] "registry-creds-764b6fb674-txh6b" [df78a81d-bc8f-4646-b2bc-3b30c7fe0f44] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1218 00:13:31.887669 1160558 system_pods.go:89] "registry-proxy-p5q9s" [f90dbfd2-e34d-4701-8e20-0f68e282c12c] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1218 00:13:31.887676 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-knbsf" [6e4d2ab3-6039-4f53-81f6-212f2701db84] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.887685 1160558 system_pods.go:89] "snapshot-controller-7d9fbc56b8-kzs8c" [99fce1ac-873c-41ce-9c8d-d67026cb693c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1218 00:13:31.887690 1160558 system_pods.go:89] "storage-provisioner" [76d99925-fdcf-45c0-9c2a-5e2a91baa077] Running
	I1218 00:13:31.887701 1160558 system_pods.go:126] duration metric: took 2.771105141s to wait for k8s-apps to be running ...
	I1218 00:13:31.887712 1160558 system_svc.go:44] waiting for kubelet service to be running ....
	I1218 00:13:31.887766 1160558 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:13:31.919555 1160558 system_svc.go:56] duration metric: took 31.833999ms WaitForService to wait for kubelet
	I1218 00:13:31.919588 1160558 kubeadm.go:587] duration metric: took 17.052669561s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:13:31.919607 1160558 node_conditions.go:102] verifying NodePressure condition ...
	I1218 00:13:31.937360 1160558 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1218 00:13:31.937392 1160558 node_conditions.go:123] node cpu capacity is 2
	I1218 00:13:31.937406 1160558 node_conditions.go:105] duration metric: took 17.79321ms to run NodePressure ...
	I1218 00:13:31.937420 1160558 start.go:242] waiting for startup goroutines ...
	I1218 00:13:31.937645 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:31.954156 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:32.121984 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:32.191384 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:32.433824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:32.453831 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:32.623072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:32.690627 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:32.933092 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:32.954219 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:33.122645 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:33.191332 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:33.433305 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:33.454239 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:33.622583 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:33.691034 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:33.935551 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:33.954076 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:34.124759 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:34.224147 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:34.433903 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:34.454052 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:34.623338 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:34.691404 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:34.939119 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:34.959392 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:35.123220 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:35.191743 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:35.433855 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:35.454376 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:35.622738 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:35.691374 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:35.955877 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:35.971922 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:36.122610 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:36.192124 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:36.433725 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:36.454768 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:36.629057 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:36.691635 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:36.941113 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:36.960863 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:37.121943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:37.191761 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:37.433115 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:37.454517 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:37.622863 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:37.691066 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:37.933646 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:37.954222 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:38.122453 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:38.191090 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:38.433579 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:38.453853 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:38.621953 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:38.691504 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:38.933405 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:38.953859 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:39.122163 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:39.190415 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:39.433426 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:39.453405 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:39.623005 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:39.691918 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:39.932853 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:39.960342 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:40.123410 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:40.191066 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:40.433024 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:40.455186 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:40.622836 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:40.724262 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:40.933258 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:40.954804 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:41.122173 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:41.191042 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:41.433216 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:41.453270 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:41.621956 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:41.691717 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:41.933670 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:41.953544 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:42.124000 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:42.192128 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:42.437218 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:42.538942 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:42.621974 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:42.691644 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:42.933943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:42.953895 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:43.122498 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:43.190677 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:43.433669 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:43.459638 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:43.621937 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:43.691427 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:43.933769 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:43.955449 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:44.126970 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:44.191469 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:44.433787 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:44.454140 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:44.622708 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:44.690887 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:44.944200 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:44.979249 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:45.125859 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:45.192372 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:45.434644 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:45.454606 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:45.621728 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:45.693216 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:45.933436 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:45.954080 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:46.122320 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:46.190559 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:46.434159 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:46.454668 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:46.622776 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:46.691473 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:46.934041 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:46.954760 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:47.126868 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:47.190712 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:47.433562 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:47.454320 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:47.622194 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:47.690455 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:47.934614 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:47.955085 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:48.123316 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:48.191254 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:48.433654 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:48.454051 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:48.622586 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:48.691393 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:48.933940 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:48.962096 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:49.122478 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:49.191177 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:49.433669 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:49.454015 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:49.622486 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:49.691442 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:49.933615 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:49.954083 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:50.122420 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:50.191176 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:50.433600 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:50.454019 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:50.624651 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:50.690933 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:50.933721 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:50.953925 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:51.128938 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:51.227035 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:51.433684 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:51.454726 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:51.624706 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:51.725417 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:51.934021 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:51.954674 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:52.121857 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:52.191161 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:52.433715 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:52.454893 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:52.623312 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:52.690774 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:52.934277 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:52.954367 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:53.122434 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:53.190763 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:53.433302 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:53.453756 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:53.621956 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:53.691969 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:53.933824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:53.954792 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:54.122758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:54.191389 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:54.434122 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:54.454885 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:54.622220 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:54.694574 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:54.932680 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:54.953924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:55.122433 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:55.197688 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:55.432897 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:55.454082 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:55.622943 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:55.691713 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:55.933673 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:55.954183 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:56.129820 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:56.191488 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:56.433830 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:56.454625 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:56.622810 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:56.691001 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:56.932525 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:56.953693 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:57.122751 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:57.190759 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:57.433088 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:57.454103 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:57.622471 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:57.690420 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:57.933941 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:57.953807 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:58.128283 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:58.228067 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:58.433482 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:58.453640 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:58.621873 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:58.691753 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:58.933072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:58.955382 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:59.123064 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:59.191654 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:59.434231 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:59.455406 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:13:59.622734 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:13:59.691079 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:13:59.933877 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:13:59.954285 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:00.125892 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:00.194465 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:00.434836 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:00.454680 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:00.621861 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:00.690909 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:00.932957 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:00.953785 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:01.122117 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:01.191108 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:01.433882 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:01.456333 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:01.622916 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:01.690785 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:01.934247 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:01.954642 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:02.121838 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:02.223048 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:02.433121 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:02.454331 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:02.622019 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:02.691089 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:02.933029 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:02.953921 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:03.122042 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:03.191848 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:03.432824 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:03.454219 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:03.622090 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:03.690414 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:03.933758 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:03.954663 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:04.122076 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:04.191176 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:04.433775 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:04.454667 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:04.622173 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:04.691959 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:04.933964 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:04.954586 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:05.122553 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:05.191511 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:05.446690 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:05.482888 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:05.622258 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:05.691158 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:05.933620 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1218 00:14:05.954080 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:06.122268 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:06.191264 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:06.434196 1160558 kapi.go:107] duration metric: took 45.004457535s to wait for kubernetes.io/minikube-addons=registry ...
	I1218 00:14:06.454272 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:06.622444 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:06.690590 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:06.953635 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:07.122139 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:07.192116 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:07.453252 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:07.622751 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:07.692939 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:07.955649 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:08.121854 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:08.191120 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:08.455904 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:08.622146 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:08.690801 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:08.954691 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:09.121538 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:09.191563 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:09.454288 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:09.622815 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:09.691418 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:09.953509 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:10.123796 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:10.191744 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:10.454912 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:10.622318 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:10.690979 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:10.954332 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:11.124034 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:11.190862 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:11.454635 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:11.621748 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:11.692154 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:11.954393 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:12.122819 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:12.191000 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:12.454525 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:12.622890 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:12.693726 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:12.953668 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:13.121852 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:13.190867 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:13.457814 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:13.631642 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:13.691035 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:13.956059 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:14.123179 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:14.200538 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:14.463553 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:14.622924 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:14.691548 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:14.955141 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:15.121698 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:15.190585 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:15.453519 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:15.622055 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:15.691030 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:15.973041 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:16.122162 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:16.190110 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:16.454411 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:16.622749 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:16.690919 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:16.954333 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:17.122889 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:17.191285 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:17.453995 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:17.621763 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:17.691056 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:17.953740 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:18.121713 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:18.191250 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:18.454480 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:18.622732 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:18.691442 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:18.953925 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:19.121777 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:19.190491 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:19.454892 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:19.628937 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:19.691388 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:19.954134 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:20.122241 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:20.191081 1160558 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1218 00:14:20.454514 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:20.626275 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:20.724777 1160558 kapi.go:107] duration metric: took 58.53766019s to wait for app.kubernetes.io/name=ingress-nginx ...
	I1218 00:14:20.954343 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:21.122344 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:21.454301 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:21.622677 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:21.957066 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:22.122819 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:22.458344 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:22.622072 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:22.954139 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:23.122475 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:23.454592 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:23.622277 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:23.954216 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:24.123266 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1218 00:14:24.454827 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:24.622571 1160558 kapi.go:107] duration metric: took 59.003792385s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I1218 00:14:24.627728 1160558 out.go:179] * Your GCP credentials will now be mounted into every pod created in the addons-399099 cluster.
	I1218 00:14:24.630920 1160558 out.go:179] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I1218 00:14:24.635227 1160558 out.go:179] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
	I1218 00:14:24.954308 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:25.455353 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:25.957807 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:26.453987 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:26.953381 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:27.454023 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:27.953510 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:28.453667 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:28.954079 1160558 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1218 00:14:29.453688 1160558 kapi.go:107] duration metric: took 1m7.003416868s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I1218 00:14:29.458297 1160558 out.go:179] * Enabled addons: inspektor-gadget, nvidia-device-plugin, cloud-spanner, storage-provisioner, registry-creds, storage-provisioner-rancher, ingress-dns, amd-gpu-device-plugin, default-storageclass, yakd, metrics-server, volumesnapshots, registry, ingress, gcp-auth, csi-hostpath-driver
	I1218 00:14:29.462025 1160558 addons.go:530] duration metric: took 1m14.593841548s for enable addons: enabled=[inspektor-gadget nvidia-device-plugin cloud-spanner storage-provisioner registry-creds storage-provisioner-rancher ingress-dns amd-gpu-device-plugin default-storageclass yakd metrics-server volumesnapshots registry ingress gcp-auth csi-hostpath-driver]
	I1218 00:14:29.462092 1160558 start.go:247] waiting for cluster config update ...
	I1218 00:14:29.462120 1160558 start.go:256] writing updated cluster config ...
	I1218 00:14:29.463147 1160558 ssh_runner.go:195] Run: rm -f paused
	I1218 00:14:29.467863 1160558 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 00:14:29.471347 1160558 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-clntp" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.477568 1160558 pod_ready.go:94] pod "coredns-66bc5c9577-clntp" is "Ready"
	I1218 00:14:29.477598 1160558 pod_ready.go:86] duration metric: took 6.221873ms for pod "coredns-66bc5c9577-clntp" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.479574 1160558 pod_ready.go:83] waiting for pod "etcd-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.483184 1160558 pod_ready.go:94] pod "etcd-addons-399099" is "Ready"
	I1218 00:14:29.483202 1160558 pod_ready.go:86] duration metric: took 3.605513ms for pod "etcd-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.484906 1160558 pod_ready.go:83] waiting for pod "kube-apiserver-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.488429 1160558 pod_ready.go:94] pod "kube-apiserver-addons-399099" is "Ready"
	I1218 00:14:29.488485 1160558 pod_ready.go:86] duration metric: took 3.554356ms for pod "kube-apiserver-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.490706 1160558 pod_ready.go:83] waiting for pod "kube-controller-manager-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:29.871876 1160558 pod_ready.go:94] pod "kube-controller-manager-addons-399099" is "Ready"
	I1218 00:14:29.871933 1160558 pod_ready.go:86] duration metric: took 381.203313ms for pod "kube-controller-manager-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.073311 1160558 pod_ready.go:83] waiting for pod "kube-proxy-7lfkl" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.472482 1160558 pod_ready.go:94] pod "kube-proxy-7lfkl" is "Ready"
	I1218 00:14:30.472510 1160558 pod_ready.go:86] duration metric: took 399.167226ms for pod "kube-proxy-7lfkl" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:30.673789 1160558 pod_ready.go:83] waiting for pod "kube-scheduler-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:31.072659 1160558 pod_ready.go:94] pod "kube-scheduler-addons-399099" is "Ready"
	I1218 00:14:31.072687 1160558 pod_ready.go:86] duration metric: took 398.870219ms for pod "kube-scheduler-addons-399099" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 00:14:31.072701 1160558 pod_ready.go:40] duration metric: took 1.604802478s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 00:14:31.137906 1160558 start.go:625] kubectl: 1.33.2, cluster: 1.34.3 (minor skew: 1)
	I1218 00:14:31.141026 1160558 out.go:179] * Done! kubectl is now configured to use "addons-399099" cluster and "default" namespace by default
	
	
	==> CRI-O <==
	Dec 18 00:14:29 addons-399099 crio[827]: time="2025-12-18T00:14:29.047083107Z" level=info msg="Created container 7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac: kube-system/csi-hostpathplugin-5v2nz/csi-snapshotter" id=7c712056-aa57-4d38-a265-7c8bf1395532 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:14:29 addons-399099 crio[827]: time="2025-12-18T00:14:29.048533067Z" level=info msg="Starting container: 7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac" id=88f1673a-f9ef-4525-ac93-61050686a2d4 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 00:14:29 addons-399099 crio[827]: time="2025-12-18T00:14:29.052693506Z" level=info msg="Started container" PID=4938 containerID=7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac description=kube-system/csi-hostpathplugin-5v2nz/csi-snapshotter id=88f1673a-f9ef-4525-ac93-61050686a2d4 name=/runtime.v1.RuntimeService/StartContainer sandboxID=761bfa3c8af023ff2a8cfeaf467205e61ff0603185837a791c0b6b141d295f0a
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.20574385Z" level=info msg="Running pod sandbox: default/busybox/POD" id=01a512e0-abc1-4f11-a081-6e147e24427c name=/runtime.v1.RuntimeService/RunPodSandbox
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.205827351Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.212705047Z" level=info msg="Got pod network &{Name:busybox Namespace:default ID:b459bbe53bb81c2f3e65ab3604ce062a27d365434941b3983a223553b0c6196b UID:a171d19f-dfd3-4558-a4cf-0b54d6fc0c69 NetNS:/var/run/netns/eff7c556-3467-4792-b8b3-f53fcb78efa6 Networks:[{Name:kindnet Ifname:eth0}] RuntimeConfig:map[kindnet:{IP: MAC: PortMappings:[] Bandwidth:<nil> IpRanges:[] CgroupPath: PodAnnotations:0x40014b04a8}] Aliases:map[]}"
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.212877053Z" level=info msg="Adding pod default_busybox to CNI network \"kindnet\" (type=ptp)"
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.227048422Z" level=info msg="Got pod network &{Name:busybox Namespace:default ID:b459bbe53bb81c2f3e65ab3604ce062a27d365434941b3983a223553b0c6196b UID:a171d19f-dfd3-4558-a4cf-0b54d6fc0c69 NetNS:/var/run/netns/eff7c556-3467-4792-b8b3-f53fcb78efa6 Networks:[{Name:kindnet Ifname:eth0}] RuntimeConfig:map[kindnet:{IP: MAC: PortMappings:[] Bandwidth:<nil> IpRanges:[] CgroupPath: PodAnnotations:0x40014b04a8}] Aliases:map[]}"
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.22732938Z" level=info msg="Checking pod default_busybox for CNI network kindnet (type=ptp)"
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.230254186Z" level=info msg="Ran pod sandbox b459bbe53bb81c2f3e65ab3604ce062a27d365434941b3983a223553b0c6196b with infra container: default/busybox/POD" id=01a512e0-abc1-4f11-a081-6e147e24427c name=/runtime.v1.RuntimeService/RunPodSandbox
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.233465842Z" level=info msg="Checking image status: gcr.io/k8s-minikube/busybox:1.28.4-glibc" id=df5a7dfd-9626-4155-aa65-e6b35a5c4ad1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.233618468Z" level=info msg="Image gcr.io/k8s-minikube/busybox:1.28.4-glibc not found" id=df5a7dfd-9626-4155-aa65-e6b35a5c4ad1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.233683319Z" level=info msg="Neither image nor artifact gcr.io/k8s-minikube/busybox:1.28.4-glibc found" id=df5a7dfd-9626-4155-aa65-e6b35a5c4ad1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.234613511Z" level=info msg="Pulling image: gcr.io/k8s-minikube/busybox:1.28.4-glibc" id=6f76e08f-69f3-4230-a5d9-01f1aca94caa name=/runtime.v1.ImageService/PullImage
	Dec 18 00:14:32 addons-399099 crio[827]: time="2025-12-18T00:14:32.243035277Z" level=info msg="Trying to access \"gcr.io/k8s-minikube/busybox:1.28.4-glibc\""
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.15630649Z" level=info msg="Pulled image: gcr.io/k8s-minikube/busybox@sha256:580b0aa58b210f512f818b7b7ef4f63c803f7a8cd6baf571b1462b79f7b7719e" id=6f76e08f-69f3-4230-a5d9-01f1aca94caa name=/runtime.v1.ImageService/PullImage
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.157131676Z" level=info msg="Checking image status: gcr.io/k8s-minikube/busybox:1.28.4-glibc" id=9e69f4c6-eabb-464c-bca5-b4d1670a5ecb name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.159062498Z" level=info msg="Checking image status: gcr.io/k8s-minikube/busybox:1.28.4-glibc" id=cf98a9be-8663-498c-8647-fa26cf665103 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.165386383Z" level=info msg="Creating container: default/busybox/busybox" id=9b1f33a6-2209-41e2-ace7-8d2616d21941 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.165492373Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.177519495Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.178015585Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.195012145Z" level=info msg="Created container a475cdf1ea4c5fcb49cf311e576e2d74091b6d6a0577bb205901ca8855fdb388: default/busybox/busybox" id=9b1f33a6-2209-41e2-ace7-8d2616d21941 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.196035388Z" level=info msg="Starting container: a475cdf1ea4c5fcb49cf311e576e2d74091b6d6a0577bb205901ca8855fdb388" id=bcf5451a-3b5b-4a77-92bc-8a5a668c078d name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 00:14:34 addons-399099 crio[827]: time="2025-12-18T00:14:34.198611857Z" level=info msg="Started container" PID=5027 containerID=a475cdf1ea4c5fcb49cf311e576e2d74091b6d6a0577bb205901ca8855fdb388 description=default/busybox/busybox id=bcf5451a-3b5b-4a77-92bc-8a5a668c078d name=/runtime.v1.RuntimeService/StartContainer sandboxID=b459bbe53bb81c2f3e65ab3604ce062a27d365434941b3983a223553b0c6196b
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                                                        CREATED              STATE               NAME                                     ATTEMPT             POD ID              POD                                         NAMESPACE
	a475cdf1ea4c5       gcr.io/k8s-minikube/busybox@sha256:580b0aa58b210f512f818b7b7ef4f63c803f7a8cd6baf571b1462b79f7b7719e                                          8 seconds ago        Running             busybox                                  0                   b459bbe53bb81       busybox                                     default
	7b50af57b1e25       registry.k8s.io/sig-storage/csi-snapshotter@sha256:bd6b8417b2a83e66ab1d4c1193bb2774f027745bdebbd9e0c1a6518afdecc39a                          13 seconds ago       Running             csi-snapshotter                          0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	1d0c5089b631a       registry.k8s.io/sig-storage/csi-provisioner@sha256:98ffd09c0784203d200e0f8c241501de31c8df79644caac7eed61bd6391e5d49                          14 seconds ago       Running             csi-provisioner                          0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	479b1b4e720d7       registry.k8s.io/sig-storage/livenessprobe@sha256:8b00c6e8f52639ed9c6f866085893ab688e57879741b3089e3cfa9998502e158                            16 seconds ago       Running             liveness-probe                           0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	af76677047f8e       registry.k8s.io/sig-storage/hostpathplugin@sha256:7b1dfc90a367222067fc468442fdf952e20fc5961f25c1ad654300ddc34d7083                           17 seconds ago       Running             hostpath                                 0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	4016d25b4ea6b       gcr.io/k8s-minikube/gcp-auth-webhook@sha256:2de98fa4b397f92e5e8e05d73caf21787a1c72c41378f3eb7bad72b1e0f4e9ff                                 18 seconds ago       Running             gcp-auth                                 0                   8047cb3ffdd9b       gcp-auth-78565c9fb4-jgkjz                   gcp-auth
	1530a02bed267       e8105550077f5c6c8e92536651451107053f0e41635396ee42aef596441c179a                                                                             21 seconds ago       Exited              patch                                    3                   f215ee67f9659       gcp-auth-certs-patch-6ql2q                  gcp-auth
	fc97664c5e67e       registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:511b8c8ac828194a753909d26555ff08bc12f497dd8daeb83fe9d593693a26c1                22 seconds ago       Running             node-driver-registrar                    0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	2ab50f9580894       registry.k8s.io/ingress-nginx/controller@sha256:75494e2145fbebf362d24e24e9285b7fbb7da8783ab272092e3126e24ee4776d                             23 seconds ago       Running             controller                               0                   24cdcae5fadc0       ingress-nginx-controller-85d4c799dd-hgt9x   ingress-nginx
	5a8d1e1ff0940       ghcr.io/inspektor-gadget/inspektor-gadget@sha256:fadc7bf59b69965b6707edb68022bed4f55a1f99b15f7acd272793e48f171496                            30 seconds ago       Running             gadget                                   0                   1f73995292cb1       gadget-rc6dl                                gadget
	0da3f638ac932       docker.io/marcnuri/yakd@sha256:0b7e831df7fe4ad1c8c56a736a8d66bd86e243f6777d3c512ead47199d8fbe1a                                              33 seconds ago       Running             yakd                                     0                   9d59b762d031d       yakd-dashboard-6654c87f9b-rbxvh             yakd-dashboard
	6820a02fa77b5       gcr.io/k8s-minikube/kube-registry-proxy@sha256:26c84a64530a67aa4d749dd4356d67ea27a2576e4d25b640d21857b0574cfd4b                              37 seconds ago       Running             registry-proxy                           0                   108a5fb8aa94d       registry-proxy-p5q9s                        kube-system
	ba89b430a4884       registry.k8s.io/sig-storage/snapshot-controller@sha256:5d668e35c15df6e87e2530da25d557f543182cedbdb39d421b87076463ee9857                      40 seconds ago       Running             volume-snapshot-controller               0                   1aa98e2aa8444       snapshot-controller-7d9fbc56b8-kzs8c        kube-system
	33876a37e66b9       nvcr.io/nvidia/k8s-device-plugin@sha256:10b7b747520ba2314061b5b319d3b2766b9cec1fd9404109c607e85b30af6905                                     40 seconds ago       Running             nvidia-device-plugin-ctr                 0                   0f78198a04813       nvidia-device-plugin-daemonset-d4dsb        kube-system
	6e71edf4ac25c       docker.io/library/registry@sha256:8715992817b2254fe61e74ffc6a4096d57a0cde36c95ea075676c05f7a94a630                                           45 seconds ago       Running             registry                                 0                   11ec0792611a6       registry-6b586f9694-k4nhf                   kube-system
	1678bbcf03188       docker.io/rancher/local-path-provisioner@sha256:689a2489a24e74426e4a4666e611c988202c5fa995908b0c60133aca3eb87d98                             46 seconds ago       Running             local-path-provisioner                   0                   4c7096b33ef9f       local-path-provisioner-648f6765c9-xht7b     local-path-storage
	efe15f55db413       registry.k8s.io/sig-storage/snapshot-controller@sha256:5d668e35c15df6e87e2530da25d557f543182cedbdb39d421b87076463ee9857                      47 seconds ago       Running             volume-snapshot-controller               0                   561754ea61f80       snapshot-controller-7d9fbc56b8-knbsf        kube-system
	0f267b721e3e3       registry.k8s.io/sig-storage/csi-external-health-monitor-controller@sha256:8b9df00898ded1bfb4d8f3672679f29cd9f88e651b76fef64121c8d347dd12c0   49 seconds ago       Running             csi-external-health-monitor-controller   0                   761bfa3c8af02       csi-hostpathplugin-5v2nz                    kube-system
	aee091757fd86       registry.k8s.io/ingress-nginx/kube-webhook-certgen@sha256:c9c1ef89e4bb9d6c9c6c0b5375c3253a0b951e5b731240be20cebe5593de142d                   50 seconds ago       Exited              create                                   0                   97f4a6a1c6186       gcp-auth-certs-create-6lrw7                 gcp-auth
	6a785405de376       docker.io/kicbase/minikube-ingress-dns@sha256:6d710af680d8a9b5a5b1f9047eb83ee4c9258efd3fcd962f938c00bcbb4c5958                               51 seconds ago       Running             minikube-ingress-dns                     0                   c061c9262ab31       kube-ingress-dns-minikube                   kube-system
	715597a141e6d       e8105550077f5c6c8e92536651451107053f0e41635396ee42aef596441c179a                                                                             51 seconds ago       Exited              patch                                    2                   249592722b5c1       ingress-nginx-admission-patch-b9f7w         ingress-nginx
	ec3f883321902       registry.k8s.io/sig-storage/csi-attacher@sha256:4b5609c78455de45821910065281a368d5f760b41250f90cbde5110543bdc326                             About a minute ago   Running             csi-attacher                             0                   8e9004760be49       csi-hostpath-attacher-0                     kube-system
	1ebed049ac583       registry.k8s.io/ingress-nginx/kube-webhook-certgen@sha256:c9c1ef89e4bb9d6c9c6c0b5375c3253a0b951e5b731240be20cebe5593de142d                   About a minute ago   Exited              create                                   0                   5fa335e974b4f       ingress-nginx-admission-create-8kc27        ingress-nginx
	fef85e094a52c       registry.k8s.io/sig-storage/csi-resizer@sha256:82c1945463342884c05a5b2bc31319712ce75b154c279c2a10765f61e0f688af                              About a minute ago   Running             csi-resizer                              0                   22bf487459011       csi-hostpath-resizer-0                      kube-system
	ad634b2fc34b9       gcr.io/cloud-spanner-emulator/emulator@sha256:daeab9cb1978e02113045625e2633619f465f22aac7638101995f4cd03607170                               About a minute ago   Running             cloud-spanner-emulator                   0                   8137998debf9b       cloud-spanner-emulator-5bdddb765-vw8bw      default
	c220c6e5aa941       registry.k8s.io/metrics-server/metrics-server@sha256:8f49cf1b0688bb0eae18437882dbf6de2c7a2baac71b1492bc4eca25439a1bf2                        About a minute ago   Running             metrics-server                           0                   58ef667b2f113       metrics-server-85b7d694d7-b7rjb             kube-system
	6b35840df9ffc       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                                                             About a minute ago   Running             coredns                                  0                   cbdbd6ad2d4d8       coredns-66bc5c9577-clntp                    kube-system
	63ec289de4c73       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6                                                                             About a minute ago   Running             storage-provisioner                      0                   1fd9516c640fa       storage-provisioner                         kube-system
	59f8baffb7c55       docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3                                           About a minute ago   Running             kindnet-cni                              0                   ed25cae18d206       kindnet-gxdvh                               kube-system
	4f3b59d54925a       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                                                             About a minute ago   Running             kube-proxy                               0                   9dff0d23f092e       kube-proxy-7lfkl                            kube-system
	b53c70f983ca4       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                                                             About a minute ago   Running             etcd                                     0                   ebe8cef45c851       etcd-addons-399099                          kube-system
	6cf23e9a9796c       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                                                             About a minute ago   Running             kube-controller-manager                  0                   01dd6d80f137f       kube-controller-manager-addons-399099       kube-system
	c3a1af0798174       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                                                             About a minute ago   Running             kube-apiserver                           0                   0a1691408e280       kube-apiserver-addons-399099                kube-system
	067ca66f2fd9c       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                                                             About a minute ago   Running             kube-scheduler                           0                   e7e713568af7b       kube-scheduler-addons-399099                kube-system
	
	
	==> coredns [6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c] <==
	[INFO] 10.244.0.17:44013 - 3524 "AAAA IN registry.kube-system.svc.cluster.local.cluster.local. udp 81 false 1232" NXDOMAIN qr,aa,rd 163 0.000146866s
	[INFO] 10.244.0.17:44013 - 12862 "A IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 94 false 1232" NXDOMAIN qr,rd,ra 83 0.002167785s
	[INFO] 10.244.0.17:44013 - 48017 "AAAA IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 94 false 1232" NXDOMAIN qr,rd,ra 83 0.001837262s
	[INFO] 10.244.0.17:44013 - 3621 "AAAA IN registry.kube-system.svc.cluster.local. udp 67 false 1232" NOERROR qr,aa,rd 149 0.000542734s
	[INFO] 10.244.0.17:44013 - 41331 "A IN registry.kube-system.svc.cluster.local. udp 67 false 1232" NOERROR qr,aa,rd 110 0.000548863s
	[INFO] 10.244.0.17:39118 - 5435 "AAAA IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000155908s
	[INFO] 10.244.0.17:39118 - 5221 "A IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000272368s
	[INFO] 10.244.0.17:40435 - 43143 "AAAA IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000111659s
	[INFO] 10.244.0.17:40435 - 42938 "A IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.00021043s
	[INFO] 10.244.0.17:51322 - 56623 "AAAA IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000133115s
	[INFO] 10.244.0.17:51322 - 56451 "A IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000153053s
	[INFO] 10.244.0.17:49830 - 2502 "A IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 83 false 512" NXDOMAIN qr,rd,ra 83 0.001223072s
	[INFO] 10.244.0.17:49830 - 2715 "AAAA IN registry.kube-system.svc.cluster.local.us-east-2.compute.internal. udp 83 false 512" NXDOMAIN qr,rd,ra 83 0.00130288s
	[INFO] 10.244.0.17:56035 - 13130 "AAAA IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 149 0.000121735s
	[INFO] 10.244.0.17:56035 - 12949 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000070365s
	[INFO] 10.244.0.21:59226 - 37541 "AAAA IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.000175181s
	[INFO] 10.244.0.21:45830 - 16842 "A IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.000154333s
	[INFO] 10.244.0.21:34631 - 65485 "A IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.000248574s
	[INFO] 10.244.0.21:51657 - 49360 "AAAA IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.00030842s
	[INFO] 10.244.0.21:53843 - 17208 "A IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000160569s
	[INFO] 10.244.0.21:60524 - 36001 "AAAA IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000061003s
	[INFO] 10.244.0.21:34029 - 55791 "AAAA IN storage.googleapis.com.us-east-2.compute.internal. udp 78 false 1232" NXDOMAIN qr,rd,ra 67 0.002212027s
	[INFO] 10.244.0.21:43457 - 26220 "A IN storage.googleapis.com.us-east-2.compute.internal. udp 78 false 1232" NXDOMAIN qr,rd,ra 67 0.002967258s
	[INFO] 10.244.0.21:52512 - 5671 "A IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 648 0.002945072s
	[INFO] 10.244.0.21:45038 - 9431 "AAAA IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 240 0.003147362s
	
	
	==> describe nodes <==
	Name:               addons-399099
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=addons-399099
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924
	                    minikube.k8s.io/name=addons-399099
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_12_18T00_13_10_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-399099
	Annotations:        csi.volume.kubernetes.io/nodeid: {"hostpath.csi.k8s.io":"addons-399099"}
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 18 Dec 2025 00:13:07 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-399099
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 18 Dec 2025 00:14:42 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 18 Dec 2025 00:14:21 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 18 Dec 2025 00:14:21 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 18 Dec 2025 00:14:21 +0000   Thu, 18 Dec 2025 00:13:03 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 18 Dec 2025 00:14:21 +0000   Thu, 18 Dec 2025 00:13:28 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    addons-399099
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	System Info:
	  Machine ID:                 02ff784b806e34735a6e229a69428228
	  System UUID:                001279d3-ffb2-40f8-9d24-edba87ec0224
	  Boot ID:                    57207cc2-434a-4297-a7b8-47b6fa2e7487
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  cri-o://1.34.3
	  Kubelet Version:            v1.34.3
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (26 in total)
	  Namespace                   Name                                         CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                         ------------  ----------  ---------------  -------------  ---
	  default                     busybox                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         11s
	  default                     cloud-spanner-emulator-5bdddb765-vw8bw       0 (0%)        0 (0%)      0 (0%)           0 (0%)         83s
	  gadget                      gadget-rc6dl                                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         82s
	  gcp-auth                    gcp-auth-78565c9fb4-jgkjz                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         77s
	  ingress-nginx               ingress-nginx-controller-85d4c799dd-hgt9x    100m (5%)     0 (0%)      90Mi (1%)        0 (0%)         80s
	  kube-system                 coredns-66bc5c9577-clntp                     100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     87s
	  kube-system                 csi-hostpath-attacher-0                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         80s
	  kube-system                 csi-hostpath-resizer-0                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         80s
	  kube-system                 csi-hostpathplugin-5v2nz                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         74s
	  kube-system                 etcd-addons-399099                           100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         95s
	  kube-system                 kindnet-gxdvh                                100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      88s
	  kube-system                 kube-apiserver-addons-399099                 250m (12%)    0 (0%)      0 (0%)           0 (0%)         93s
	  kube-system                 kube-controller-manager-addons-399099        200m (10%)    0 (0%)      0 (0%)           0 (0%)         93s
	  kube-system                 kube-ingress-dns-minikube                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         81s
	  kube-system                 kube-proxy-7lfkl                             0 (0%)        0 (0%)      0 (0%)           0 (0%)         88s
	  kube-system                 kube-scheduler-addons-399099                 100m (5%)     0 (0%)      0 (0%)           0 (0%)         93s
	  kube-system                 metrics-server-85b7d694d7-b7rjb              100m (5%)     0 (0%)      200Mi (2%)       0 (0%)         81s
	  kube-system                 nvidia-device-plugin-daemonset-d4dsb         0 (0%)        0 (0%)      0 (0%)           0 (0%)         73s
	  kube-system                 registry-6b586f9694-k4nhf                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         82s
	  kube-system                 registry-creds-764b6fb674-txh6b              0 (0%)        0 (0%)      0 (0%)           0 (0%)         84s
	  kube-system                 registry-proxy-p5q9s                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         74s
	  kube-system                 snapshot-controller-7d9fbc56b8-knbsf         0 (0%)        0 (0%)      0 (0%)           0 (0%)         81s
	  kube-system                 snapshot-controller-7d9fbc56b8-kzs8c         0 (0%)        0 (0%)      0 (0%)           0 (0%)         81s
	  kube-system                 storage-provisioner                          0 (0%)        0 (0%)      0 (0%)           0 (0%)         82s
	  local-path-storage          local-path-provisioner-648f6765c9-xht7b      0 (0%)        0 (0%)      0 (0%)           0 (0%)         82s
	  yakd-dashboard              yakd-dashboard-6654c87f9b-rbxvh              0 (0%)        0 (0%)      128Mi (1%)       256Mi (3%)     81s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                1050m (52%)  100m (5%)
	  memory             638Mi (8%)   476Mi (6%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-1Gi      0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	  hugepages-32Mi     0 (0%)       0 (0%)
	  hugepages-64Ki     0 (0%)       0 (0%)
	Events:
	  Type     Reason                   Age   From             Message
	  ----     ------                   ----  ----             -------
	  Normal   Starting                 86s   kube-proxy       
	  Normal   Starting                 93s   kubelet          Starting kubelet.
	  Warning  CgroupV1                 93s   kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  93s   kubelet          Node addons-399099 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    93s   kubelet          Node addons-399099 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     93s   kubelet          Node addons-399099 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           88s   node-controller  Node addons-399099 event: Registered Node addons-399099 in Controller
	  Normal   NodeReady                74s   kubelet          Node addons-399099 status is now: NodeReady
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762] <==
	{"level":"warn","ts":"2025-12-18T00:13:05.595814Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56198","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.620084Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56220","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.631231Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56238","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.657239Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56254","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.688764Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56280","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.704881Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56300","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.740979Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56310","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.766452Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56336","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.796819Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56346","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.823007Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56364","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.853456Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56392","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.880477Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56422","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.900387Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56438","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.915910Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56454","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.946343Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56462","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.964522Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56474","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:05.982593Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56504","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:06.002322Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56514","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:06.119430Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:56534","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:22.666987Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:37604","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:22.683274Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:37634","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.046928Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45506","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.089224Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45516","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.139488Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45530","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T00:13:44.150437Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:45550","server-name":"","error":"EOF"}
	
	
	==> gcp-auth [4016d25b4ea6bebd5e4f1f120bb8151fc38d4aba4842de30c291845fa041b242] <==
	2025/12/18 00:14:23 GCP Auth Webhook started!
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	2025/12/18 00:14:31 Ready to marshal response ...
	2025/12/18 00:14:31 Ready to write response ...
	
	
	==> kernel <==
	 00:14:42 up  6:57,  0 user,  load average: 3.50, 2.14, 2.01
	Linux addons-399099 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4] <==
	I1218 00:13:18.564353       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T00:13:18Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1218 00:13:18.825255       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 00:13:18.825280       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 00:13:18.825289       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 00:13:18.825693       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1218 00:13:19.026302       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1218 00:13:19.026327       1 metrics.go:72] Registering metrics
	I1218 00:13:19.026384       1 controller.go:711] "Syncing nftables rules"
	I1218 00:13:28.756287       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:13:28.756424       1 main.go:301] handling current node
	I1218 00:13:38.756294       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:13:38.756333       1 main.go:301] handling current node
	I1218 00:13:48.753232       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:13:48.753263       1 main.go:301] handling current node
	I1218 00:13:58.753934       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:13:58.753966       1 main.go:301] handling current node
	I1218 00:14:08.753795       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:14:08.753838       1 main.go:301] handling current node
	I1218 00:14:18.755552       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:14:18.755585       1 main.go:301] handling current node
	I1218 00:14:28.756319       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:14:28.756429       1 main.go:301] handling current node
	I1218 00:14:38.757786       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1218 00:14:38.757822       1 main.go:301] handling current node
	
	
	==> kube-apiserver [c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af] <==
	W1218 00:13:22.683288       1 logging.go:55] [core] [Channel #263 SubChannel #264]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	I1218 00:13:25.460179       1 alloc.go:328] "allocated clusterIPs" service="gcp-auth/gcp-auth" clusterIPs={"IPv4":"10.109.186.134"}
	W1218 00:13:28.895270       1 dispatcher.go:210] Failed calling webhook, failing open gcp-auth-mutate.k8s.io: failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.109.186.134:443: connect: connection refused
	E1218 00:13:28.895320       1 dispatcher.go:214] "Unhandled Error" err="failed calling webhook \"gcp-auth-mutate.k8s.io\": failed to call webhook: Post \"https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s\": dial tcp 10.109.186.134:443: connect: connection refused" logger="UnhandledError"
	W1218 00:13:28.895285       1 dispatcher.go:210] Failed calling webhook, failing open gcp-auth-mutate.k8s.io: failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.109.186.134:443: connect: connection refused
	E1218 00:13:28.895876       1 dispatcher.go:214] "Unhandled Error" err="failed calling webhook \"gcp-auth-mutate.k8s.io\": failed to call webhook: Post \"https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s\": dial tcp 10.109.186.134:443: connect: connection refused" logger="UnhandledError"
	W1218 00:13:29.009990       1 dispatcher.go:210] Failed calling webhook, failing open gcp-auth-mutate.k8s.io: failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.109.186.134:443: connect: connection refused
	E1218 00:13:29.010087       1 dispatcher.go:214] "Unhandled Error" err="failed calling webhook \"gcp-auth-mutate.k8s.io\": failed to call webhook: Post \"https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s\": dial tcp 10.109.186.134:443: connect: connection refused" logger="UnhandledError"
	W1218 00:13:44.040480       1 logging.go:55] [core] [Channel #270 SubChannel #271]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:44.080400       1 logging.go:55] [core] [Channel #274 SubChannel #275]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:44.113340       1 logging.go:55] [core] [Channel #278 SubChannel #279]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W1218 00:13:44.143187       1 logging.go:55] [core] [Channel #282 SubChannel #283]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1218 00:13:45.009582       1 handler_proxy.go:99] no RequestInfo found in the context
	E1218 00:13:45.009685       1 controller.go:146] "Unhandled Error" err=<
		Error updating APIService "v1beta1.metrics.k8s.io" with err: failed to download v1beta1.metrics.k8s.io: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
		, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	 > logger="UnhandledError"
	E1218 00:13:45.010463       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.013895       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.044213       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.078428       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	E1218 00:13:45.119428       1 remote_available_controller.go:462] "Unhandled Error" err="v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1: Get \"https://10.103.158.111:443/apis/metrics.k8s.io/v1beta1\": dial tcp 10.103.158.111:443: connect: connection refused" logger="UnhandledError"
	I1218 00:13:45.352108       1 handler.go:285] Adding GroupVersion metrics.k8s.io v1beta1 to ResourceManager
	E1218 00:14:40.215480       1 conn.go:339] Error on socket receive: read tcp 192.168.49.2:8443->192.168.49.1:51234: use of closed network connection
	E1218 00:14:40.432396       1 conn.go:339] Error on socket receive: read tcp 192.168.49.2:8443->192.168.49.1:51266: use of closed network connection
	
	
	==> kube-controller-manager [6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4] <==
	I1218 00:13:14.051193       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1218 00:13:14.058655       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1218 00:13:14.062518       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrapproving"
	I1218 00:13:14.062618       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 00:13:14.062653       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1218 00:13:14.062682       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1218 00:13:14.064556       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1218 00:13:14.064684       1 shared_informer.go:356] "Caches are synced" controller="expand"
	I1218 00:13:14.064927       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1218 00:13:14.065074       1 shared_informer.go:356] "Caches are synced" controller="deployment"
	I1218 00:13:14.065309       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1218 00:13:14.065960       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1218 00:13:14.067087       1 shared_informer.go:356] "Caches are synced" controller="PV protection"
	I1218 00:13:14.067479       1 shared_informer.go:356] "Caches are synced" controller="ReplicationController"
	I1218 00:13:14.067508       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 00:13:14.068573       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 00:13:14.069797       1 shared_informer.go:356] "Caches are synced" controller="validatingadmissionpolicy-status"
	I1218 00:13:29.283416       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	E1218 00:13:44.033325       1 resource_quota_controller.go:446] "Unhandled Error" err="unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1: stale GroupVersion discovery: metrics.k8s.io/v1beta1" logger="UnhandledError"
	I1218 00:13:44.033465       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="volumesnapshots.snapshot.storage.k8s.io"
	I1218 00:13:44.033522       1 shared_informer.go:349] "Waiting for caches to sync" controller="resource quota"
	I1218 00:13:44.068337       1 garbagecollector.go:787] "failed to discover some groups" logger="garbage-collector-controller" groups="map[\"metrics.k8s.io/v1beta1\":\"stale GroupVersion discovery: metrics.k8s.io/v1beta1\"]"
	I1218 00:13:44.077137       1 shared_informer.go:349] "Waiting for caches to sync" controller="garbage collector"
	I1218 00:13:44.136336       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 00:13:44.177307       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	
	
	==> kube-proxy [4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98] <==
	I1218 00:13:15.996913       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:13:16.066401       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 00:13:16.169334       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 00:13:16.169367       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1218 00:13:16.169434       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 00:13:16.221108       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 00:13:16.221158       1 server_linux.go:132] "Using iptables Proxier"
	I1218 00:13:16.232881       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 00:13:16.233195       1 server.go:527] "Version info" version="v1.34.3"
	I1218 00:13:16.233209       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:13:16.254630       1 config.go:200] "Starting service config controller"
	I1218 00:13:16.254652       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 00:13:16.254670       1 config.go:106] "Starting endpoint slice config controller"
	I1218 00:13:16.254676       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 00:13:16.254687       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 00:13:16.254691       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 00:13:16.255417       1 config.go:309] "Starting node config controller"
	I1218 00:13:16.255425       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 00:13:16.255431       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 00:13:16.355642       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 00:13:16.355678       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 00:13:16.355732       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6] <==
	I1218 00:13:07.407170       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	I1218 00:13:07.407421       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:13:07.407481       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:13:07.413728       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E1218 00:13:07.414588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:13:07.424370       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:13:07.425141       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:13:07.425337       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:13:07.425393       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:13:07.425467       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:13:07.425504       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:13:07.425536       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:13:07.425570       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:13:07.425602       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:13:07.425635       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:13:07.425685       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:13:07.426434       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:13:07.426518       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:13:07.426590       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:13:07.426695       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:13:07.426889       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:13:07.427008       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:13:07.427070       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:13:08.401665       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1218 00:13:11.514695       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Dec 18 00:14:03 addons-399099 kubelet[1278]: I1218 00:14:03.132034    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/nvidia-device-plugin-daemonset-d4dsb" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:14:06 addons-399099 kubelet[1278]: I1218 00:14:06.147555    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-proxy-p5q9s" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:14:06 addons-399099 kubelet[1278]: I1218 00:14:06.168401    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/snapshot-controller-7d9fbc56b8-kzs8c" podStartSLOduration=13.29810559 podStartE2EDuration="45.168383469s" podCreationTimestamp="2025-12-18 00:13:21 +0000 UTC" firstStartedPulling="2025-12-18 00:13:30.07687183 +0000 UTC m=+20.554485259" lastFinishedPulling="2025-12-18 00:14:01.947149709 +0000 UTC m=+52.424763138" observedRunningTime="2025-12-18 00:14:02.15579067 +0000 UTC m=+52.633404099" watchObservedRunningTime="2025-12-18 00:14:06.168383469 +0000 UTC m=+56.645996897"
	Dec 18 00:14:06 addons-399099 kubelet[1278]: I1218 00:14:06.169061    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/registry-proxy-p5q9s" podStartSLOduration=2.894496943 podStartE2EDuration="38.169050662s" podCreationTimestamp="2025-12-18 00:13:28 +0000 UTC" firstStartedPulling="2025-12-18 00:13:30.080677868 +0000 UTC m=+20.558291297" lastFinishedPulling="2025-12-18 00:14:05.355231587 +0000 UTC m=+55.832845016" observedRunningTime="2025-12-18 00:14:06.167886714 +0000 UTC m=+56.645500151" watchObservedRunningTime="2025-12-18 00:14:06.169050662 +0000 UTC m=+56.646664099"
	Dec 18 00:14:06 addons-399099 kubelet[1278]: I1218 00:14:06.637754    1278 scope.go:117] "RemoveContainer" containerID="cbae14c7bbaa590980723720ed907a8b495855957bdfa7fef24e652be688da3d"
	Dec 18 00:14:06 addons-399099 kubelet[1278]: E1218 00:14:06.637933    1278 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"patch\" with CrashLoopBackOff: \"back-off 20s restarting failed container=patch pod=gcp-auth-certs-patch-6ql2q_gcp-auth(09c9cf55-376a-40b5-ad1d-7fea7b1bd261)\"" pod="gcp-auth/gcp-auth-certs-patch-6ql2q" podUID="09c9cf55-376a-40b5-ad1d-7fea7b1bd261"
	Dec 18 00:14:07 addons-399099 kubelet[1278]: I1218 00:14:07.151493    1278 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="kube-system/registry-proxy-p5q9s" secret="" err="secret \"gcp-auth\" not found"
	Dec 18 00:14:09 addons-399099 kubelet[1278]: I1218 00:14:09.174258    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="yakd-dashboard/yakd-dashboard-6654c87f9b-rbxvh" podStartSLOduration=9.384577583 podStartE2EDuration="48.174240398s" podCreationTimestamp="2025-12-18 00:13:21 +0000 UTC" firstStartedPulling="2025-12-18 00:13:30.080731667 +0000 UTC m=+20.558345096" lastFinishedPulling="2025-12-18 00:14:08.870394457 +0000 UTC m=+59.348007911" observedRunningTime="2025-12-18 00:14:09.17379417 +0000 UTC m=+59.651407639" watchObservedRunningTime="2025-12-18 00:14:09.174240398 +0000 UTC m=+59.651853835"
	Dec 18 00:14:13 addons-399099 kubelet[1278]: I1218 00:14:13.245419    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="gadget/gadget-rc6dl" podStartSLOduration=17.79515453 podStartE2EDuration="53.245401208s" podCreationTimestamp="2025-12-18 00:13:20 +0000 UTC" firstStartedPulling="2025-12-18 00:13:36.782766581 +0000 UTC m=+27.260380010" lastFinishedPulling="2025-12-18 00:14:12.233013259 +0000 UTC m=+62.710626688" observedRunningTime="2025-12-18 00:14:13.244586631 +0000 UTC m=+63.722200060" watchObservedRunningTime="2025-12-18 00:14:13.245401208 +0000 UTC m=+63.723014645"
	Dec 18 00:14:20 addons-399099 kubelet[1278]: I1218 00:14:20.253059    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="ingress-nginx/ingress-nginx-controller-85d4c799dd-hgt9x" podStartSLOduration=24.327453364 podStartE2EDuration="58.253042718s" podCreationTimestamp="2025-12-18 00:13:22 +0000 UTC" firstStartedPulling="2025-12-18 00:13:45.293590782 +0000 UTC m=+35.771204326" lastFinishedPulling="2025-12-18 00:14:19.219180251 +0000 UTC m=+69.696793680" observedRunningTime="2025-12-18 00:14:20.252634732 +0000 UTC m=+70.730248161" watchObservedRunningTime="2025-12-18 00:14:20.253042718 +0000 UTC m=+70.730656155"
	Dec 18 00:14:20 addons-399099 kubelet[1278]: I1218 00:14:20.638129    1278 scope.go:117] "RemoveContainer" containerID="cbae14c7bbaa590980723720ed907a8b495855957bdfa7fef24e652be688da3d"
	Dec 18 00:14:21 addons-399099 kubelet[1278]: I1218 00:14:21.239537    1278 scope.go:117] "RemoveContainer" containerID="cbae14c7bbaa590980723720ed907a8b495855957bdfa7fef24e652be688da3d"
	Dec 18 00:14:22 addons-399099 kubelet[1278]: I1218 00:14:22.808191    1278 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qfw\" (UniqueName: \"kubernetes.io/projected/09c9cf55-376a-40b5-ad1d-7fea7b1bd261-kube-api-access-f6qfw\") pod \"09c9cf55-376a-40b5-ad1d-7fea7b1bd261\" (UID: \"09c9cf55-376a-40b5-ad1d-7fea7b1bd261\") "
	Dec 18 00:14:22 addons-399099 kubelet[1278]: I1218 00:14:22.811690    1278 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c9cf55-376a-40b5-ad1d-7fea7b1bd261-kube-api-access-f6qfw" (OuterVolumeSpecName: "kube-api-access-f6qfw") pod "09c9cf55-376a-40b5-ad1d-7fea7b1bd261" (UID: "09c9cf55-376a-40b5-ad1d-7fea7b1bd261"). InnerVolumeSpecName "kube-api-access-f6qfw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
	Dec 18 00:14:22 addons-399099 kubelet[1278]: I1218 00:14:22.909293    1278 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6qfw\" (UniqueName: \"kubernetes.io/projected/09c9cf55-376a-40b5-ad1d-7fea7b1bd261-kube-api-access-f6qfw\") on node \"addons-399099\" DevicePath \"\""
	Dec 18 00:14:23 addons-399099 kubelet[1278]: I1218 00:14:23.252315    1278 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f215ee67f9659b5a9d4252fc93f5605d95b5c2f552982ad259380e17506bf44b"
	Dec 18 00:14:24 addons-399099 kubelet[1278]: I1218 00:14:24.285009    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="gcp-auth/gcp-auth-78565c9fb4-jgkjz" podStartSLOduration=37.098854421 podStartE2EDuration="59.284978563s" podCreationTimestamp="2025-12-18 00:13:25 +0000 UTC" firstStartedPulling="2025-12-18 00:14:01.475766689 +0000 UTC m=+51.953380118" lastFinishedPulling="2025-12-18 00:14:23.661890823 +0000 UTC m=+74.139504260" observedRunningTime="2025-12-18 00:14:24.28489329 +0000 UTC m=+74.762506735" watchObservedRunningTime="2025-12-18 00:14:24.284978563 +0000 UTC m=+74.762592000"
	Dec 18 00:14:25 addons-399099 kubelet[1278]: I1218 00:14:25.646904    1278 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3fc767-7514-4e96-b9bc-fae1c01dfe35" path="/var/lib/kubelet/pods/db3fc767-7514-4e96-b9bc-fae1c01dfe35/volumes"
	Dec 18 00:14:25 addons-399099 kubelet[1278]: I1218 00:14:25.809418    1278 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: hostpath.csi.k8s.io endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
	Dec 18 00:14:25 addons-399099 kubelet[1278]: I1218 00:14:25.809609    1278 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: hostpath.csi.k8s.io at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
	Dec 18 00:14:29 addons-399099 kubelet[1278]: I1218 00:14:29.311316    1278 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/csi-hostpathplugin-5v2nz" podStartSLOduration=2.222304437 podStartE2EDuration="1m1.311298761s" podCreationTimestamp="2025-12-18 00:13:28 +0000 UTC" firstStartedPulling="2025-12-18 00:13:29.920548611 +0000 UTC m=+20.398162040" lastFinishedPulling="2025-12-18 00:14:29.009542935 +0000 UTC m=+79.487156364" observedRunningTime="2025-12-18 00:14:29.309764085 +0000 UTC m=+79.787377522" watchObservedRunningTime="2025-12-18 00:14:29.311298761 +0000 UTC m=+79.788912198"
	Dec 18 00:14:31 addons-399099 kubelet[1278]: I1218 00:14:31.993038    1278 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmpkj\" (UniqueName: \"kubernetes.io/projected/a171d19f-dfd3-4558-a4cf-0b54d6fc0c69-kube-api-access-dmpkj\") pod \"busybox\" (UID: \"a171d19f-dfd3-4558-a4cf-0b54d6fc0c69\") " pod="default/busybox"
	Dec 18 00:14:31 addons-399099 kubelet[1278]: I1218 00:14:31.993119    1278 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"gcp-creds\" (UniqueName: \"kubernetes.io/host-path/a171d19f-dfd3-4558-a4cf-0b54d6fc0c69-gcp-creds\") pod \"busybox\" (UID: \"a171d19f-dfd3-4558-a4cf-0b54d6fc0c69\") " pod="default/busybox"
	Dec 18 00:14:32 addons-399099 kubelet[1278]: E1218 00:14:32.799289    1278 secret.go:189] Couldn't get secret kube-system/registry-creds-gcr: secret "registry-creds-gcr" not found
	Dec 18 00:14:32 addons-399099 kubelet[1278]: E1218 00:14:32.799391    1278 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df78a81d-bc8f-4646-b2bc-3b30c7fe0f44-gcr-creds podName:df78a81d-bc8f-4646-b2bc-3b30c7fe0f44 nodeName:}" failed. No retries permitted until 2025-12-18 00:15:36.79937007 +0000 UTC m=+147.276983515 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "gcr-creds" (UniqueName: "kubernetes.io/secret/df78a81d-bc8f-4646-b2bc-3b30c7fe0f44-gcr-creds") pod "registry-creds-764b6fb674-txh6b" (UID: "df78a81d-bc8f-4646-b2bc-3b30c7fe0f44") : secret "registry-creds-gcr" not found
	
	
	==> storage-provisioner [63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67] <==
	W1218 00:14:17.863471       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:19.867328       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:19.873884       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:21.877489       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:21.885371       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:23.888277       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:23.895264       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:25.897707       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:25.905073       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:27.907711       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:27.912464       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:29.915773       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:29.920085       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:31.941883       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:31.946590       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:33.949934       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:33.956944       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:35.959958       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:35.964936       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:37.968375       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:37.974981       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:39.978587       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:39.983655       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:41.986562       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1218 00:14:41.991857       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p addons-399099 -n addons-399099
helpers_test.go:270: (dbg) Run:  kubectl --context addons-399099 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:281: non-running pods: gcp-auth-certs-patch-6ql2q ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w registry-creds-764b6fb674-txh6b
helpers_test.go:283: ======> post-mortem[TestAddons/parallel/Headlamp]: describe non-running pods <======
helpers_test.go:286: (dbg) Run:  kubectl --context addons-399099 describe pod gcp-auth-certs-patch-6ql2q ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w registry-creds-764b6fb674-txh6b
helpers_test.go:286: (dbg) Non-zero exit: kubectl --context addons-399099 describe pod gcp-auth-certs-patch-6ql2q ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w registry-creds-764b6fb674-txh6b: exit status 1 (92.207491ms)

** stderr ** 
	Error from server (NotFound): pods "gcp-auth-certs-patch-6ql2q" not found
	Error from server (NotFound): pods "ingress-nginx-admission-create-8kc27" not found
	Error from server (NotFound): pods "ingress-nginx-admission-patch-b9f7w" not found
	Error from server (NotFound): pods "registry-creds-764b6fb674-txh6b" not found

** /stderr **
helpers_test.go:288: kubectl --context addons-399099 describe pod gcp-auth-certs-patch-6ql2q ingress-nginx-admission-create-8kc27 ingress-nginx-admission-patch-b9f7w registry-creds-764b6fb674-txh6b: exit status 1
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable headlamp --alsologtostderr -v=1: exit status 11 (280.622576ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:14:44.018259 1167178 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:14:44.019169 1167178 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:44.019222 1167178 out.go:374] Setting ErrFile to fd 2...
	I1218 00:14:44.019245 1167178 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:14:44.019541 1167178 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:14:44.019875 1167178 mustload.go:66] Loading cluster: addons-399099
	I1218 00:14:44.020348 1167178 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:44.020397 1167178 addons.go:622] checking whether the cluster is paused
	I1218 00:14:44.020536 1167178 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:14:44.020571 1167178 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:14:44.021175 1167178 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:14:44.046735 1167178 ssh_runner.go:195] Run: systemctl --version
	I1218 00:14:44.046796 1167178 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:14:44.063833 1167178 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:14:44.174657 1167178 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:14:44.174772 1167178 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:14:44.207282 1167178 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:14:44.207355 1167178 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:14:44.207375 1167178 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:14:44.207394 1167178 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:14:44.207421 1167178 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:14:44.207441 1167178 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:14:44.207460 1167178 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:14:44.207477 1167178 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:14:44.207494 1167178 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:14:44.207514 1167178 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:14:44.207531 1167178 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:14:44.207548 1167178 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:14:44.207565 1167178 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:14:44.207591 1167178 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:14:44.207609 1167178 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:14:44.207629 1167178 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:14:44.207653 1167178 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:14:44.207678 1167178 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:14:44.207697 1167178 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:14:44.207714 1167178 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:14:44.207735 1167178 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:14:44.207759 1167178 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:14:44.207777 1167178 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:14:44.207794 1167178 cri.go:89] found id: ""
	I1218 00:14:44.207860 1167178 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:14:44.222898 1167178 out.go:203] 
	W1218 00:14:44.225682 1167178 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:44Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:14:44Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:14:44.225708 1167178 out.go:285] * 
	* 
	W1218 00:14:44.233599 1167178 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_efe3f0a65eabdab15324ffdebd5a66da17706a9c_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_efe3f0a65eabdab15324ffdebd5a66da17706a9c_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:14:44.236648 1167178 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable headlamp addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable headlamp --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/Headlamp (3.40s)

TestAddons/parallel/CloudSpanner (5.46s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:353: "cloud-spanner-emulator-5bdddb765-vw8bw" [f075e1b1-4610-4f7e-ac3b-62dbb808d57c] Running
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.004962417s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable cloud-spanner --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable cloud-spanner --alsologtostderr -v=1: exit status 11 (448.378467ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:16:00.304711 1169158 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:16:00.322634 1169158 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:16:00.322660 1169158 out.go:374] Setting ErrFile to fd 2...
	I1218 00:16:00.322668 1169158 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:16:00.327140 1169158 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:16:00.327609 1169158 mustload.go:66] Loading cluster: addons-399099
	I1218 00:16:00.328548 1169158 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:16:00.328585 1169158 addons.go:622] checking whether the cluster is paused
	I1218 00:16:00.328829 1169158 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:16:00.328865 1169158 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:16:00.329856 1169158 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:16:00.409384 1169158 ssh_runner.go:195] Run: systemctl --version
	I1218 00:16:00.409446 1169158 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:16:00.454077 1169158 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:16:00.567149 1169158 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:16:00.567332 1169158 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:16:00.599311 1169158 cri.go:89] found id: "a8cc12099b33daa7fe96e24eed983163a093fcdb3984195b1638d8c1b93d10f6"
	I1218 00:16:00.599339 1169158 cri.go:89] found id: "6bf6c8b341c6c285e492c277ed41d9902cf8bd73552a43976339fe5b3b36649d"
	I1218 00:16:00.599345 1169158 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:16:00.599349 1169158 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:16:00.599358 1169158 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:16:00.599362 1169158 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:16:00.599366 1169158 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:16:00.599370 1169158 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:16:00.599373 1169158 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:16:00.599379 1169158 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:16:00.599382 1169158 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:16:00.599388 1169158 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:16:00.599392 1169158 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:16:00.599396 1169158 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:16:00.599399 1169158 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:16:00.599405 1169158 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:16:00.599411 1169158 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:16:00.599416 1169158 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:16:00.599419 1169158 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:16:00.599422 1169158 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:16:00.599427 1169158 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:16:00.599430 1169158 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:16:00.599433 1169158 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:16:00.599437 1169158 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:16:00.599440 1169158 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:16:00.599443 1169158 cri.go:89] found id: ""
	I1218 00:16:00.599499 1169158 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:16:00.614935 1169158 out.go:203] 
	W1218 00:16:00.617914 1169158 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:16:00Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:16:00Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:16:00.617940 1169158 out.go:285] * 
	* 
	W1218 00:16:00.625764 1169158 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_e93ff976b7e98e1dc466aded9385c0856b6d1b41_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_e93ff976b7e98e1dc466aded9385c0856b6d1b41_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:16:00.628682 1169158 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable cloud-spanner addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable cloud-spanner --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/CloudSpanner (5.46s)

TestAddons/parallel/LocalPath (8.62s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:951: (dbg) Run:  kubectl --context addons-399099 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:957: (dbg) Run:  kubectl --context addons-399099 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:961: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:353: "test-local-path" [2e5b4795-c1c8-46a4-9dff-e45755c17ad4] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "test-local-path" [2e5b4795-c1c8-46a4-9dff-e45755c17ad4] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "test-local-path" [2e5b4795-c1c8-46a4-9dff-e45755c17ad4] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.002758027s
addons_test.go:969: (dbg) Run:  kubectl --context addons-399099 get pvc test-pvc -o=json
addons_test.go:978: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 ssh "cat /opt/local-path-provisioner/pvc-e70fd2c3-45de-49ac-a61e-78ca412545ff_default_test-pvc/file1"
addons_test.go:990: (dbg) Run:  kubectl --context addons-399099 delete pod test-local-path
addons_test.go:994: (dbg) Run:  kubectl --context addons-399099 delete pvc test-pvc
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable storage-provisioner-rancher --alsologtostderr -v=1: exit status 11 (389.336045ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:43.323533 1168870 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:43.327140 1168870 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:43.327164 1168870 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:43.327171 1168870 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:43.327504 1168870 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:43.327893 1168870 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:43.328374 1168870 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:43.328396 1168870 addons.go:622] checking whether the cluster is paused
	I1218 00:15:43.328549 1168870 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:43.328568 1168870 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:43.329130 1168870 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:43.354660 1168870 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:43.354727 1168870 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:43.389872 1168870 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:43.521745 1168870 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:43.521844 1168870 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:43.571646 1168870 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:43.571671 1168870 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:43.571677 1168870 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:43.571680 1168870 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:43.571684 1168870 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:43.571688 1168870 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:43.571691 1168870 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:43.571695 1168870 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:43.571698 1168870 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:43.571707 1168870 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:43.571711 1168870 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:43.571714 1168870 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:43.571717 1168870 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:43.571720 1168870 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:43.571723 1168870 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:43.571735 1168870 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:43.571741 1168870 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:43.571746 1168870 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:43.571749 1168870 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:43.571752 1168870 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:43.571757 1168870 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:43.571763 1168870 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:43.571766 1168870 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:43.571769 1168870 cri.go:89] found id: ""
	I1218 00:15:43.571819 1168870 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:43.592530 1168870 out.go:203] 
	W1218 00:15:43.595717 1168870 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:43Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:43Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:43.595800 1168870 out.go:285] * 
	* 
	W1218 00:15:43.603682 1168870 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_e8b2053d4ef30ba659303f708d034237180eb1ed_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_e8b2053d4ef30ba659303f708d034237180eb1ed_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:43.607458 1168870 out.go:203] 

** /stderr **
addons_test.go:1057: failed to disable storage-provisioner-rancher addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable storage-provisioner-rancher --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/LocalPath (8.62s)

TestAddons/parallel/NvidiaDevicePlugin (5.27s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:353: "nvidia-device-plugin-daemonset-d4dsb" [ae85f151-c5c0-42d9-a8ec-b9331a33aa25] Running
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.005425957s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable nvidia-device-plugin --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable nvidia-device-plugin --alsologtostderr -v=1: exit status 11 (266.823441ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I1218 00:15:54.957067 1169088 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:54.957929 1169088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:54.957981 1169088 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:54.958002 1169088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:54.958292 1169088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:54.958615 1169088 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:54.959029 1169088 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:54.959072 1169088 addons.go:622] checking whether the cluster is paused
	I1218 00:15:54.959234 1169088 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:54.959270 1169088 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:54.959777 1169088 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:54.976617 1169088 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:54.976682 1169088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:54.993616 1169088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:55.103222 1169088 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:55.103307 1169088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:55.138098 1169088 cri.go:89] found id: "6bf6c8b341c6c285e492c277ed41d9902cf8bd73552a43976339fe5b3b36649d"
	I1218 00:15:55.138121 1169088 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:55.138126 1169088 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:55.138130 1169088 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:55.138134 1169088 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:55.138137 1169088 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:55.138140 1169088 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:55.138144 1169088 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:55.138146 1169088 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:55.138153 1169088 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:55.138159 1169088 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:55.138163 1169088 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:55.138166 1169088 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:55.138169 1169088 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:55.138173 1169088 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:55.138180 1169088 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:55.138186 1169088 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:55.138191 1169088 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:55.138194 1169088 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:55.138197 1169088 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:55.138202 1169088 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:55.138210 1169088 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:55.138213 1169088 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:55.138216 1169088 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:55.138219 1169088 cri.go:89] found id: ""
	I1218 00:15:55.138270 1169088 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:55.153398 1169088 out.go:203] 
	W1218 00:15:55.156283 1169088 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:55Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:55Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:55.156303 1169088 out.go:285] * 
	* 
	W1218 00:15:55.164044 1169088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_47e1a72799625313bd916979b0f8aa84efd54736_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_47e1a72799625313bd916979b0f8aa84efd54736_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:55.166845 1169088 out.go:203] 
** /stderr **
addons_test.go:1057: failed to disable nvidia-device-plugin addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable nvidia-device-plugin --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/NvidiaDevicePlugin (5.27s)
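The failure above (and the identical Yakd failure that follows) stems from the paused-state check: minikube shells out to `sudo runc list -f json`, which exits non-zero on this CRI-O node because the runc state directory `/run/runc` does not exist. The sketch below illustrates that failure mode with a hypothetical probe helper; the function name, the demo paths, and the crictl fallback are illustrative assumptions, not minikube's actual implementation.

```shell
# Hypothetical paused-state probe. runc keeps container state under its
# state root (/run/runc by default); on a CRI-O node that directory can be
# absent, which is what produces "open /run/runc: no such file or directory"
# in the log above. A probe could check for the directory and fall back to
# crictl instead of failing hard.
probe_paused() {
  runc_root=$1
  if [ -d "$runc_root" ]; then
    echo "query runc"         # e.g. sudo runc --root "$runc_root" list -f json
  else
    echo "fallback to crictl" # e.g. sudo crictl ps -a --state paused --quiet
  fi
}

probe_paused /run/runc-demo-missing   # prints "fallback to crictl"
```

With the real state root present (runc as the low-level runtime), the same probe would take the `query runc` branch; the test failure shows the missing-directory branch is the one that must not be fatal.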
TestAddons/parallel/Yakd (6.28s)
=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd
=== CONT  TestAddons/parallel/Yakd
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:353: "yakd-dashboard-6654c87f9b-rbxvh" [2958f0b5-9664-4571-90d7-dbc73c42ddb3] Running
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.003274959s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-399099 addons disable yakd --alsologtostderr -v=1
addons_test.go:1055: (dbg) Non-zero exit: out/minikube-linux-arm64 -p addons-399099 addons disable yakd --alsologtostderr -v=1: exit status 11 (276.51276ms)
-- stdout --
	
	
-- /stdout --
** stderr ** 
	I1218 00:15:49.678367 1169027 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:15:49.679122 1169027 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:49.679139 1169027 out.go:374] Setting ErrFile to fd 2...
	I1218 00:15:49.679145 1169027 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:15:49.679412 1169027 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:15:49.679721 1169027 mustload.go:66] Loading cluster: addons-399099
	I1218 00:15:49.680092 1169027 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:49.680111 1169027 addons.go:622] checking whether the cluster is paused
	I1218 00:15:49.680253 1169027 config.go:182] Loaded profile config "addons-399099": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:15:49.680268 1169027 host.go:66] Checking if "addons-399099" exists ...
	I1218 00:15:49.680784 1169027 cli_runner.go:164] Run: docker container inspect addons-399099 --format={{.State.Status}}
	I1218 00:15:49.704018 1169027 ssh_runner.go:195] Run: systemctl --version
	I1218 00:15:49.704076 1169027 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-399099
	I1218 00:15:49.723192 1169027 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33910 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/addons-399099/id_rsa Username:docker}
	I1218 00:15:49.830786 1169027 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:15:49.830868 1169027 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:15:49.866677 1169027 cri.go:89] found id: "6bf6c8b341c6c285e492c277ed41d9902cf8bd73552a43976339fe5b3b36649d"
	I1218 00:15:49.866697 1169027 cri.go:89] found id: "7b50af57b1e2500501e3f5c40c9e5d87d1730b490f70f2a90fa69087944263ac"
	I1218 00:15:49.866702 1169027 cri.go:89] found id: "1d0c5089b631a7f6c79d2a5360fb6562c65fbbed809ed5071df7d26d42813fce"
	I1218 00:15:49.866705 1169027 cri.go:89] found id: "479b1b4e720d7f6c183807d29b6edd2d2b8bdb62c39ca93889dbdaddc43073b3"
	I1218 00:15:49.866709 1169027 cri.go:89] found id: "af76677047f8e953499840da71ef00484cfe059beb3c1982c90e58280a48ba48"
	I1218 00:15:49.866712 1169027 cri.go:89] found id: "fc97664c5e67ed90add31363e3e4e2cfd82b036a5a20449b5b287187b0258560"
	I1218 00:15:49.866716 1169027 cri.go:89] found id: "6820a02fa77b59ecf89529737ff3957fbd6e6bf035105d815212183d467fa778"
	I1218 00:15:49.866719 1169027 cri.go:89] found id: "ba89b430a4884094559c12d33137a5166c2a23e57eff31ba68a98e30730ec2c2"
	I1218 00:15:49.866722 1169027 cri.go:89] found id: "33876a37e66b99db4dd758c4bc552a851092973be2c96ce3c1f26cadc32a91f8"
	I1218 00:15:49.866728 1169027 cri.go:89] found id: "6e71edf4ac25c9b3180082aa2c0795096e79e9d4cc43b735bf4524058ba0533b"
	I1218 00:15:49.866731 1169027 cri.go:89] found id: "efe15f55db4131d0b92b9b78ea2ddc6cadb6cdec1909bad0c80713564586d5a2"
	I1218 00:15:49.866735 1169027 cri.go:89] found id: "0f267b721e3e33c2182fe5b761ff69eecff3d7f75fe84f685bb96607940ac8c5"
	I1218 00:15:49.866738 1169027 cri.go:89] found id: "6a785405de3763c9ab8b8855d5868b56063bad3da6018dbc6e59f6f6042a2ba8"
	I1218 00:15:49.866741 1169027 cri.go:89] found id: "ec3f883321902fb9bf51669bf47281eac533c9dea2c46befe26662376ae6808e"
	I1218 00:15:49.866744 1169027 cri.go:89] found id: "fef85e094a52cc4da72f21e6512ab258ebd483c93d1139de2bbdd807d88c43ac"
	I1218 00:15:49.866751 1169027 cri.go:89] found id: "c220c6e5aa9418c5dbc60135dc7ba6cb89ba9adc948036579347c22fe255fafa"
	I1218 00:15:49.866754 1169027 cri.go:89] found id: "6b35840df9ffc48a9c3e11f7436b38e35aab520c7fd3150b0b9745a6d34e1c1c"
	I1218 00:15:49.866759 1169027 cri.go:89] found id: "63ec289de4c73c0a007e8556cc49402c60c207a63857549ba7fb71ca58dcef67"
	I1218 00:15:49.866762 1169027 cri.go:89] found id: "59f8baffb7c55e9a207a0a101319243b5c8749036e28fdc4c9e16bf25806abf4"
	I1218 00:15:49.866765 1169027 cri.go:89] found id: "4f3b59d54925af7538256f4d0a0b8c6296e527576242d7dda86c302d6de4fc98"
	I1218 00:15:49.866770 1169027 cri.go:89] found id: "b53c70f983ca40ce8fb877b6cfca34ba4f0d2e98dd5241f7d9f70d7d0898b762"
	I1218 00:15:49.866773 1169027 cri.go:89] found id: "6cf23e9a9796cf2aa49503cf765b2c1dfe42f98a18b0c71e96d45e18ba60b8b4"
	I1218 00:15:49.866776 1169027 cri.go:89] found id: "c3a1af07981748bb37813bc7a0974f2f36723f0a27e7db28ddb4b6f9bb1fe7af"
	I1218 00:15:49.866779 1169027 cri.go:89] found id: "067ca66f2fd9c67b418ac8fa1696ddbd98945b96337e777a88aa2657955e34b6"
	I1218 00:15:49.866782 1169027 cri.go:89] found id: ""
	I1218 00:15:49.866842 1169027 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:15:49.881946 1169027 out.go:203] 
	W1218 00:15:49.884888 1169027 out.go:285] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:49Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:15:49Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 00:15:49.884914 1169027 out.go:285] * 
	* 
	W1218 00:15:49.892800 1169027 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_82e5d844def28f20a5cac88dc27578ab5d1e7e1a_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_82e5d844def28f20a5cac88dc27578ab5d1e7e1a_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:15:49.895894 1169027 out.go:203] 
** /stderr **
addons_test.go:1057: failed to disable yakd addon: args "out/minikube-linux-arm64 -p addons-399099 addons disable yakd --alsologtostderr -v=1": exit status 11
--- FAIL: TestAddons/parallel/Yakd (6.28s)
TestFunctional/serial/SoftStart (464.23s)
=== RUN   TestFunctional/serial/SoftStart
I1218 00:19:34.052302 1159552 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --alsologtostderr -v=8
E1218 00:19:34.601705 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:37.163175 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:42.284829 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:52.526413 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:20:13.007910 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:20:53.970410 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:22:15.892441 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:24:32.022061 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:24:59.734743 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-240845 --alsologtostderr -v=8: exit status 80 (7m40.522818676s)
-- stdout --
	* [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	
-- /stdout --
** stderr ** 
	I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:19:34.105346 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105377 1177669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:19:34.105397 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105673 1177669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:19:34.106120 1177669 out.go:368] Setting JSON to false
	I1218 00:19:34.107069 1177669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25322,"bootTime":1765991852,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:19:34.107165 1177669 start.go:143] virtualization:  
	I1218 00:19:34.110567 1177669 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:19:34.114275 1177669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:19:34.114378 1177669 notify.go:221] Checking for updates...
	I1218 00:19:34.120029 1177669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:19:34.122925 1177669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:19:34.125751 1177669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:19:34.128638 1177669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:19:34.131461 1177669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:19:34.134887 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:34.134985 1177669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:19:34.159427 1177669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:19:34.159542 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.223972 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.214884618 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.224090 1177669 docker.go:319] overlay module found
	I1218 00:19:34.227220 1177669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:19:34.229963 1177669 start.go:309] selected driver: docker
	I1218 00:19:34.229985 1177669 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false D
isableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.230103 1177669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:19:34.230199 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.285040 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.2764408 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aar
ch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.285449 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:19:34.285507 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:19:34.285561 1177669 start.go:353] cluster config:
	{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Containe
rRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetC
lientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.290381 1177669 out.go:179] * Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	I1218 00:19:34.293210 1177669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:19:34.297960 1177669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:19:34.300783 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:19:34.300829 1177669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:19:34.300855 1177669 cache.go:65] Caching tarball of preloaded images
	I1218 00:19:34.300881 1177669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:19:34.300940 1177669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:19:34.300950 1177669 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:19:34.301056 1177669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/config.json ...
	I1218 00:19:34.320164 1177669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:19:34.320186 1177669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:19:34.320203 1177669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:19:34.320279 1177669 start.go:360] acquireMachinesLock for functional-240845: {Name:mk3ed718f4cde9dd7b19ef8d5bcd86c3175b5067 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:19:34.320350 1177669 start.go:364] duration metric: took 45.89µs to acquireMachinesLock for "functional-240845"
	I1218 00:19:34.320375 1177669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:19:34.320383 1177669 fix.go:54] fixHost starting: 
	I1218 00:19:34.320643 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:19:34.337200 1177669 fix.go:112] recreateIfNeeded on functional-240845: state=Running err=<nil>
	W1218 00:19:34.337231 1177669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:19:34.340534 1177669 out.go:252] * Updating the running docker "functional-240845" container ...
	I1218 00:19:34.340583 1177669 machine.go:94] provisionDockerMachine start ...
	I1218 00:19:34.340661 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.357593 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.357953 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.357966 1177669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:19:34.511862 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.511889 1177669 ubuntu.go:182] provisioning hostname "functional-240845"
	I1218 00:19:34.511951 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.530122 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.530421 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.530437 1177669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-240845 && echo "functional-240845" | sudo tee /etc/hostname
	I1218 00:19:34.693713 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.693796 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.711115 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.711437 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.711457 1177669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-240845' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-240845/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-240845' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:19:34.868676 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:19:34.868704 1177669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:19:34.868727 1177669 ubuntu.go:190] setting up certificates
	I1218 00:19:34.868737 1177669 provision.go:84] configureAuth start
	I1218 00:19:34.868796 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:34.885386 1177669 provision.go:143] copyHostCerts
	I1218 00:19:34.885436 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885473 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:19:34.885484 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885557 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:19:34.885647 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885670 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:19:34.885675 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885701 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:19:34.885784 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885802 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:19:34.885807 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885830 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:19:34.885882 1177669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-240845 san=[127.0.0.1 192.168.49.2 functional-240845 localhost minikube]
	I1218 00:19:35.070465 1177669 provision.go:177] copyRemoteCerts
	I1218 00:19:35.070558 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:19:35.070625 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.089175 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:35.196164 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:19:35.196247 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:19:35.213266 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:19:35.213323 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:19:35.231357 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:19:35.231416 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:19:35.249293 1177669 provision.go:87] duration metric: took 380.542312ms to configureAuth
	I1218 00:19:35.249372 1177669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:19:35.249565 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:35.249673 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.267176 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:35.267503 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:35.267526 1177669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:19:40.661888 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:19:40.661918 1177669 machine.go:97] duration metric: took 6.321326566s to provisionDockerMachine
	I1218 00:19:40.661929 1177669 start.go:293] postStartSetup for "functional-240845" (driver="docker")
	I1218 00:19:40.661947 1177669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:19:40.662006 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:19:40.662069 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.679665 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.787680 1177669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:19:40.790725 1177669 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:19:40.790745 1177669 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:19:40.790750 1177669 command_runner.go:130] > VERSION_ID="12"
	I1218 00:19:40.790757 1177669 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:19:40.790762 1177669 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:19:40.790766 1177669 command_runner.go:130] > ID=debian
	I1218 00:19:40.790771 1177669 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:19:40.790776 1177669 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:19:40.790785 1177669 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:19:40.790821 1177669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:19:40.790843 1177669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:19:40.790853 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:19:40.790906 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:19:40.790988 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:19:40.791003 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:19:40.791081 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:19:40.791089 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:19:40.791141 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:19:40.798177 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:19:40.814786 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:19:40.830892 1177669 start.go:296] duration metric: took 168.948549ms for postStartSetup
	I1218 00:19:40.831030 1177669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:19:40.831082 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.848091 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.952833 1177669 command_runner.go:130] > 13%
	I1218 00:19:40.953354 1177669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:19:40.957853 1177669 command_runner.go:130] > 171G
	I1218 00:19:40.958309 1177669 fix.go:56] duration metric: took 6.637921757s for fixHost
	I1218 00:19:40.958329 1177669 start.go:83] releasing machines lock for "functional-240845", held for 6.637966499s
	I1218 00:19:40.958394 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:40.975843 1177669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:19:40.975911 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.976173 1177669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:19:40.976254 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.995610 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.013560 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.099878 1177669 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:19:41.100025 1177669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:19:41.195326 1177669 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:19:41.198525 1177669 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:19:41.198598 1177669 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:19:41.198697 1177669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:19:41.321255 1177669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:19:41.326138 1177669 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:19:41.326216 1177669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:19:41.326312 1177669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:19:41.337406 1177669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:19:41.337470 1177669 start.go:496] detecting cgroup driver to use...
	I1218 00:19:41.337517 1177669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:19:41.337604 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:19:41.364732 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:19:41.395259 1177669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:19:41.395373 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:19:41.425216 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:19:41.453795 1177669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:19:41.688599 1177669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:19:41.909163 1177669 docker.go:234] disabling docker service ...
	I1218 00:19:41.909312 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:19:41.926883 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:19:41.943387 1177669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:19:42.156451 1177669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:19:42.449825 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:19:42.467750 1177669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:19:42.493864 1177669 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:19:42.495463 1177669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:19:42.495560 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.506971 1177669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:19:42.507118 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.518977 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.530876 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.539925 1177669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:19:42.553447 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.569558 1177669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.582698 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.597525 1177669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:19:42.608606 1177669 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:19:42.609612 1177669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:19:42.617962 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:19:42.846451 1177669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:21:13.130293 1177669 ssh_runner.go:235] Completed: sudo systemctl restart crio: (1m30.283808536s)
	I1218 00:21:13.130318 1177669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:21:13.130368 1177669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:21:13.134416 1177669 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:21:13.134438 1177669 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:21:13.134453 1177669 command_runner.go:130] > Device: 0,72	Inode: 804         Links: 1
	I1218 00:21:13.134460 1177669 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:13.134465 1177669 command_runner.go:130] > Access: 2025-12-18 00:21:13.087402358 +0000
	I1218 00:21:13.134471 1177669 command_runner.go:130] > Modify: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134475 1177669 command_runner.go:130] > Change: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134479 1177669 command_runner.go:130] >  Birth: -
	I1218 00:21:13.134836 1177669 start.go:564] Will wait 60s for crictl version
	I1218 00:21:13.134895 1177669 ssh_runner.go:195] Run: which crictl
	I1218 00:21:13.138647 1177669 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:21:13.138725 1177669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:21:13.167266 1177669 command_runner.go:130] > Version:  0.1.0
	I1218 00:21:13.167284 1177669 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:21:13.167289 1177669 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:21:13.167294 1177669 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:21:13.169251 1177669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:21:13.169347 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.194596 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.194618 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.194624 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.194629 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.194634 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.194639 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.194643 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.194656 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.194660 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.194671 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.194674 1177669 command_runner.go:130] >      static
	I1218 00:21:13.194678 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.194682 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.194686 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.194689 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.194693 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.194697 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.194701 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.194705 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.194709 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.196349 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.221274 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.221297 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.221302 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.221308 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.221313 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.221318 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.221321 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.221326 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.221331 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.221334 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.221338 1177669 command_runner.go:130] >      static
	I1218 00:21:13.221341 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.221345 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.221350 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.221353 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.221357 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.221360 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.221364 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.221369 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.221373 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.226046 1177669 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:21:13.228983 1177669 cli_runner.go:164] Run: docker network inspect functional-240845 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:21:13.244579 1177669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:21:13.248178 1177669 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:21:13.248440 1177669 kubeadm.go:884] updating cluster {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:21:13.248553 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:21:13.248613 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.282229 1177669 command_runner.go:130] > {
	I1218 00:21:13.282251 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.282256 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282265 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.282269 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282275 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.282279 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282283 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282294 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.282305 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.282308 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282313 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.282332 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282342 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282346 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282350 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282356 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.282364 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282370 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.282373 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282378 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282389 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.282398 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.282403 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282408 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.282415 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282422 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282426 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282434 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282444 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.282449 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282454 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.282462 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282466 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282475 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.282483 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.282491 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282495 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.282499 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282503 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282506 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282509 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282516 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.282523 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282528 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.282532 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282536 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282549 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.282557 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.282564 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282569 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.282578 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.282586 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282589 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282592 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282599 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.282606 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282611 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.282615 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282624 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282631 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.282643 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.282647 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282651 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.282658 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282661 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282665 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282669 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282676 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282680 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282698 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282709 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.282714 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282719 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.282726 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282729 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282737 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.282746 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.282751 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282755 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.282759 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282765 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282769 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282777 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282782 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282785 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282788 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282795 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.282802 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282807 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.282811 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282815 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282828 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.282836 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.282843 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282850 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.282853 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282862 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282865 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282869 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282873 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282883 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282887 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282894 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.282902 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282907 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.282910 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282913 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282922 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.282941 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.282952 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282957 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.282967 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282970 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282973 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282976 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282984 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.282999 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283004 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.283007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283010 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283018 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.283026 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.283029 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283036 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.283040 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283046 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.283054 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283061 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283065 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.283067 1177669 command_runner.go:130] >     },
	I1218 00:21:13.283071 1177669 command_runner.go:130] >     {
	I1218 00:21:13.283079 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.283084 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283089 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.283092 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283099 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283107 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.283116 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.283122 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283126 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.283129 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283133 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.283136 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283144 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283148 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.283155 1177669 command_runner.go:130] >     }
	I1218 00:21:13.283158 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.283161 1177669 command_runner.go:130] > }
	I1218 00:21:13.283336 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.283347 1177669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:21:13.283410 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.307800 1177669 command_runner.go:130] > {
	I1218 00:21:13.307819 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.307823 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307831 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.307836 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307841 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.307845 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307849 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307861 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.307869 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.307872 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307877 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.307881 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307886 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307889 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307893 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307899 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.307903 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307909 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.307912 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307921 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307929 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.307940 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.307943 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307947 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.307951 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307959 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307962 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307965 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307971 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.307975 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307980 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.307983 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307987 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307995 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.308003 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.308007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308011 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.308015 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308020 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308023 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308026 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308032 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.308036 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308042 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.308045 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308049 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308057 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.308065 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.308068 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308072 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.308080 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.308084 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308087 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308090 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308099 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.308103 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308108 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.308111 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308114 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308122 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.308129 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.308132 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308136 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.308140 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308143 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308146 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308149 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308153 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308156 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308159 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308165 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.308168 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308173 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.308176 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308180 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308188 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.308195 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.308198 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308202 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.308206 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308210 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308213 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308217 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308241 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308244 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308247 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308253 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.308262 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308269 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.308275 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308279 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308287 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.308295 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.308298 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308302 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.308306 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308309 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308312 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308316 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308319 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308323 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308325 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308332 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.308335 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308340 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.308343 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308347 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308354 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.308370 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.308374 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308377 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.308381 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308385 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308387 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308390 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308397 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.308400 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308405 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.308408 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308412 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308422 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.308430 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.308433 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308437 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.308440 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308444 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308447 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308450 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308455 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308458 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308461 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308468 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.308472 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308477 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.308480 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308484 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308491 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.308498 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.308501 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308505 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.308508 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308512 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.308515 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308518 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308522 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.308524 1177669 command_runner.go:130] >     }
	I1218 00:21:13.308527 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.308529 1177669 command_runner.go:130] > }
	I1218 00:21:13.310403 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.310424 1177669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:21:13.310432 1177669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.3 crio true true} ...
	I1218 00:21:13.310536 1177669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-240845 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:21:13.310619 1177669 ssh_runner.go:195] Run: crio config
	I1218 00:21:13.358161 1177669 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:21:13.358186 1177669 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:21:13.358194 1177669 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:21:13.358198 1177669 command_runner.go:130] > #
	I1218 00:21:13.358205 1177669 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:21:13.358212 1177669 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:21:13.358218 1177669 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:21:13.358229 1177669 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:21:13.358236 1177669 command_runner.go:130] > # reload'.
	I1218 00:21:13.358243 1177669 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:21:13.358250 1177669 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:21:13.358258 1177669 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:21:13.358264 1177669 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:21:13.358267 1177669 command_runner.go:130] > [crio]
	I1218 00:21:13.358273 1177669 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:21:13.358277 1177669 command_runner.go:130] > # containers images, in this directory.
	I1218 00:21:13.358820 1177669 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:21:13.358837 1177669 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:21:13.359435 1177669 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:21:13.359448 1177669 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:21:13.359935 1177669 command_runner.go:130] > # imagestore = ""
	I1218 00:21:13.359950 1177669 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:21:13.359963 1177669 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:21:13.360646 1177669 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:21:13.360660 1177669 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:21:13.360667 1177669 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:21:13.360961 1177669 command_runner.go:130] > # storage_option = [
	I1218 00:21:13.361308 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.361321 1177669 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:21:13.361334 1177669 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:21:13.361921 1177669 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:21:13.361934 1177669 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:21:13.361949 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:21:13.361954 1177669 command_runner.go:130] > # always happen on a node reboot
	I1218 00:21:13.362559 1177669 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:21:13.362583 1177669 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:21:13.362590 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:21:13.362595 1177669 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:21:13.363052 1177669 command_runner.go:130] > # version_file_persist = ""
	I1218 00:21:13.363067 1177669 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:21:13.363076 1177669 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:21:13.363680 1177669 command_runner.go:130] > # internal_wipe = true
	I1218 00:21:13.363702 1177669 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:21:13.363709 1177669 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:21:13.364349 1177669 command_runner.go:130] > # internal_repair = true
	I1218 00:21:13.364361 1177669 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:21:13.364368 1177669 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:21:13.364377 1177669 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:21:13.364926 1177669 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:21:13.364942 1177669 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:21:13.364946 1177669 command_runner.go:130] > [crio.api]
	I1218 00:21:13.364951 1177669 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:21:13.365581 1177669 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:21:13.365594 1177669 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:21:13.367685 1177669 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:21:13.367700 1177669 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:21:13.367706 1177669 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:21:13.367710 1177669 command_runner.go:130] > # stream_port = "0"
	I1218 00:21:13.367716 1177669 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:21:13.367723 1177669 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:21:13.367730 1177669 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:21:13.367745 1177669 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:21:13.367752 1177669 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:21:13.367762 1177669 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367766 1177669 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:21:13.367773 1177669 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:21:13.367780 1177669 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367784 1177669 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:21:13.367791 1177669 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:21:13.367802 1177669 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:21:13.367808 1177669 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:21:13.367814 1177669 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:21:13.367835 1177669 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367844 1177669 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:21:13.367853 1177669 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367861 1177669 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:21:13.367868 1177669 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:21:13.367879 1177669 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:21:13.367883 1177669 command_runner.go:130] > [crio.runtime]
	I1218 00:21:13.367893 1177669 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:21:13.367904 1177669 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:21:13.367916 1177669 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:21:13.367926 1177669 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:21:13.367934 1177669 command_runner.go:130] > # default_ulimits = [
	I1218 00:21:13.367937 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.367950 1177669 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:21:13.367958 1177669 command_runner.go:130] > # no_pivot = false
	I1218 00:21:13.367963 1177669 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:21:13.367974 1177669 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:21:13.367979 1177669 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:21:13.367988 1177669 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:21:13.367994 1177669 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:21:13.368004 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368012 1177669 command_runner.go:130] > # conmon = ""
	I1218 00:21:13.368015 1177669 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:21:13.368023 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:21:13.368026 1177669 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:21:13.368035 1177669 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:21:13.368044 1177669 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:21:13.368051 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368058 1177669 command_runner.go:130] > # conmon_env = [
	I1218 00:21:13.368061 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368070 1177669 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:21:13.368076 1177669 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:21:13.368084 1177669 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:21:13.368089 1177669 command_runner.go:130] > # default_env = [
	I1218 00:21:13.368092 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368098 1177669 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:21:13.368111 1177669 command_runner.go:130] > # This option is deprecated, and will be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:21:13.368119 1177669 command_runner.go:130] > # selinux = false
	I1218 00:21:13.368125 1177669 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:21:13.368136 1177669 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:21:13.368144 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368148 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.368159 1177669 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:21:13.368167 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368171 1177669 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:21:13.368178 1177669 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:21:13.368189 1177669 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:21:13.368199 1177669 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:21:13.368206 1177669 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:21:13.368212 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368217 1177669 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:21:13.368256 1177669 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:21:13.368261 1177669 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:21:13.368266 1177669 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:21:13.368280 1177669 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:21:13.368287 1177669 command_runner.go:130] > # blockio parameters.
	I1218 00:21:13.368292 1177669 command_runner.go:130] > # blockio_reload = false
	I1218 00:21:13.368298 1177669 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:21:13.368303 1177669 command_runner.go:130] > # irqbalance daemon.
	I1218 00:21:13.368311 1177669 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:21:13.368320 1177669 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:21:13.368327 1177669 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:21:13.368337 1177669 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:21:13.368347 1177669 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:21:13.368357 1177669 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:21:13.368365 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368369 1177669 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:21:13.368375 1177669 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:21:13.368382 1177669 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:21:13.368388 1177669 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:21:13.368396 1177669 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:21:13.368402 1177669 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:21:13.368412 1177669 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:21:13.368419 1177669 command_runner.go:130] > # will be added.
	I1218 00:21:13.368423 1177669 command_runner.go:130] > # default_capabilities = [
	I1218 00:21:13.368430 1177669 command_runner.go:130] > # 	"CHOWN",
	I1218 00:21:13.368434 1177669 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:21:13.368442 1177669 command_runner.go:130] > # 	"FSETID",
	I1218 00:21:13.368445 1177669 command_runner.go:130] > # 	"FOWNER",
	I1218 00:21:13.368457 1177669 command_runner.go:130] > # 	"SETGID",
	I1218 00:21:13.368461 1177669 command_runner.go:130] > # 	"SETUID",
	I1218 00:21:13.368479 1177669 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:21:13.368487 1177669 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:21:13.368490 1177669 command_runner.go:130] > # 	"KILL",
	I1218 00:21:13.368494 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368506 1177669 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:21:13.368515 1177669 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:21:13.368524 1177669 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:21:13.368531 1177669 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:21:13.368539 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368542 1177669 command_runner.go:130] > default_sysctls = [
	I1218 00:21:13.368547 1177669 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:21:13.368554 1177669 command_runner.go:130] > ]
	I1218 00:21:13.368563 1177669 command_runner.go:130] > # List of devices on the host that a
	I1218 00:21:13.368570 1177669 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:21:13.368577 1177669 command_runner.go:130] > # allowed_devices = [
	I1218 00:21:13.368580 1177669 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:21:13.368588 1177669 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:21:13.368594 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368603 1177669 command_runner.go:130] > # List of additional devices, specified as
	I1218 00:21:13.368611 1177669 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:21:13.368618 1177669 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:21:13.368624 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368628 1177669 command_runner.go:130] > # additional_devices = [
	I1218 00:21:13.368633 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368639 1177669 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:21:13.368646 1177669 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:21:13.368649 1177669 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:21:13.368653 1177669 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:21:13.368664 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368673 1177669 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:21:13.368683 1177669 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:21:13.368701 1177669 command_runner.go:130] > # Defaults to false.
	I1218 00:21:13.368712 1177669 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:21:13.368719 1177669 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:21:13.368725 1177669 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:21:13.368734 1177669 command_runner.go:130] > # hooks_dir = [
	I1218 00:21:13.368739 1177669 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:21:13.368745 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368751 1177669 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:21:13.368761 1177669 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:21:13.368770 1177669 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:21:13.368773 1177669 command_runner.go:130] > #
	I1218 00:21:13.368780 1177669 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:21:13.368789 1177669 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:21:13.368795 1177669 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:21:13.368803 1177669 command_runner.go:130] > #
	I1218 00:21:13.368809 1177669 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:21:13.368818 1177669 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:21:13.368829 1177669 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:21:13.368846 1177669 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:21:13.368853 1177669 command_runner.go:130] > #
	I1218 00:21:13.368857 1177669 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:21:13.368866 1177669 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:21:13.368876 1177669 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:21:13.368880 1177669 command_runner.go:130] > # pids_limit = -1
	I1218 00:21:13.368886 1177669 command_runner.go:130] > # Maximum size allowed for the container log file. Negative numbers indicate
	I1218 00:21:13.368894 1177669 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:21:13.368904 1177669 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:21:13.368917 1177669 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:21:13.368923 1177669 command_runner.go:130] > # log_size_max = -1
	I1218 00:21:13.368931 1177669 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:21:13.368938 1177669 command_runner.go:130] > # log_to_journald = false
	I1218 00:21:13.368944 1177669 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:21:13.368949 1177669 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:21:13.368959 1177669 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:21:13.368968 1177669 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:21:13.368974 1177669 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:21:13.368981 1177669 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:21:13.368986 1177669 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:21:13.368993 1177669 command_runner.go:130] > # read_only = false
	I1218 00:21:13.369000 1177669 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:21:13.369009 1177669 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:21:13.369017 1177669 command_runner.go:130] > # live configuration reload.
	I1218 00:21:13.369020 1177669 command_runner.go:130] > # log_level = "info"
	I1218 00:21:13.369026 1177669 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:21:13.369031 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.369036 1177669 command_runner.go:130] > # log_filter = ""
	I1218 00:21:13.369043 1177669 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369052 1177669 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:21:13.369056 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369067 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369074 1177669 command_runner.go:130] > # uid_mappings = ""
	I1218 00:21:13.369084 1177669 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369093 1177669 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:21:13.369097 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369105 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369114 1177669 command_runner.go:130] > # gid_mappings = ""
	I1218 00:21:13.369120 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:21:13.369127 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369139 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369150 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369158 1177669 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:21:13.369165 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:21:13.369174 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369184 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369192 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369196 1177669 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:21:13.369208 1177669 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:21:13.369218 1177669 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:21:13.369224 1177669 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:21:13.369231 1177669 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:21:13.369238 1177669 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:21:13.369247 1177669 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:21:13.369256 1177669 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:21:13.369261 1177669 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:21:13.369265 1177669 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:21:13.369273 1177669 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:21:13.369279 1177669 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:21:13.369286 1177669 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:21:13.369293 1177669 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:21:13.369301 1177669 command_runner.go:130] > # shared_cpuset determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:21:13.369310 1177669 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:21:13.369320 1177669 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:21:13.369326 1177669 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:21:13.369333 1177669 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:21:13.369339 1177669 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:21:13.369347 1177669 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:21:13.369351 1177669 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:21:13.369359 1177669 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:21:13.369363 1177669 command_runner.go:130] > # pinns_path = ""
	I1218 00:21:13.369368 1177669 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:21:13.369378 1177669 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:21:13.369382 1177669 command_runner.go:130] > # enable_criu_support = true
	I1218 00:21:13.369390 1177669 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:21:13.369400 1177669 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:21:13.369407 1177669 command_runner.go:130] > # enable_pod_events = false
	I1218 00:21:13.369414 1177669 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:21:13.369422 1177669 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:21:13.369426 1177669 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:21:13.369431 1177669 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:21:13.369443 1177669 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:21:13.369457 1177669 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:21:13.369465 1177669 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:21:13.369474 1177669 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:21:13.369481 1177669 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:21:13.369486 1177669 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:21:13.369492 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.369499 1177669 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:21:13.369509 1177669 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:21:13.369515 1177669 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:21:13.369521 1177669 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:21:13.369523 1177669 command_runner.go:130] > #
	I1218 00:21:13.369528 1177669 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:21:13.369536 1177669 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:21:13.369540 1177669 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:21:13.369548 1177669 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:21:13.369553 1177669 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:21:13.369561 1177669 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:21:13.369565 1177669 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:21:13.369574 1177669 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:21:13.369577 1177669 command_runner.go:130] > # monitor_env = []
	I1218 00:21:13.369585 1177669 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:21:13.369590 1177669 command_runner.go:130] > # allowed_annotations = []
	I1218 00:21:13.369595 1177669 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:21:13.369599 1177669 command_runner.go:130] > # no_sync_log = false
	I1218 00:21:13.369603 1177669 command_runner.go:130] > # default_annotations = {}
	I1218 00:21:13.369611 1177669 command_runner.go:130] > # stream_websockets = false
	I1218 00:21:13.369614 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.369664 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.369673 1177669 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:21:13.369680 1177669 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:21:13.369686 1177669 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:21:13.369697 1177669 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:21:13.369708 1177669 command_runner.go:130] > #   in $PATH.
	I1218 00:21:13.369718 1177669 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:21:13.369728 1177669 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:21:13.369735 1177669 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:21:13.369741 1177669 command_runner.go:130] > #   state.
	I1218 00:21:13.369747 1177669 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:21:13.369753 1177669 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:21:13.369759 1177669 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:21:13.369765 1177669 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:21:13.369774 1177669 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:21:13.369780 1177669 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:21:13.369789 1177669 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:21:13.369795 1177669 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:21:13.369805 1177669 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:21:13.369813 1177669 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:21:13.369820 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:21:13.369831 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:21:13.370100 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:21:13.370120 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:21:13.370129 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:21:13.370143 1177669 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:21:13.370151 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:21:13.370162 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:21:13.370169 1177669 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:21:13.370176 1177669 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:21:13.370187 1177669 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:21:13.370195 1177669 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:21:13.370206 1177669 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:21:13.370213 1177669 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:21:13.370219 1177669 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:21:13.370232 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:21:13.370239 1177669 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:21:13.370249 1177669 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:21:13.370266 1177669 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:21:13.370271 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:21:13.370283 1177669 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:21:13.370288 1177669 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:21:13.370295 1177669 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:21:13.370305 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:21:13.370313 1177669 command_runner.go:130] > #   When using the pod runtime and conmon-rs, monitor_env can be used to further configure
	I1218 00:21:13.370317 1177669 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:21:13.370329 1177669 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:21:13.370338 1177669 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:21:13.370350 1177669 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:21:13.370357 1177669 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:21:13.370367 1177669 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:21:13.370375 1177669 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:21:13.370388 1177669 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:21:13.370395 1177669 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:21:13.370408 1177669 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:21:13.370420 1177669 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:21:13.370425 1177669 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:21:13.370437 1177669 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:21:13.370445 1177669 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:21:13.370459 1177669 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:21:13.370466 1177669 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:21:13.370473 1177669 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:21:13.370485 1177669 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:21:13.370488 1177669 command_runner.go:130] > #
	I1218 00:21:13.370493 1177669 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:21:13.370496 1177669 command_runner.go:130] > #
	I1218 00:21:13.370506 1177669 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:21:13.370513 1177669 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:21:13.370516 1177669 command_runner.go:130] > #
	I1218 00:21:13.370525 1177669 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:21:13.370537 1177669 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:21:13.370545 1177669 command_runner.go:130] > #
	I1218 00:21:13.370553 1177669 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:21:13.370556 1177669 command_runner.go:130] > # feature.
	I1218 00:21:13.370563 1177669 command_runner.go:130] > #
	I1218 00:21:13.370569 1177669 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:21:13.370576 1177669 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:21:13.370587 1177669 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:21:13.370594 1177669 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:21:13.370600 1177669 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:21:13.370610 1177669 command_runner.go:130] > #
	I1218 00:21:13.370618 1177669 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:21:13.370625 1177669 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:21:13.370628 1177669 command_runner.go:130] > #
	I1218 00:21:13.370638 1177669 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:21:13.370644 1177669 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:21:13.370647 1177669 command_runner.go:130] > #
	I1218 00:21:13.370657 1177669 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:21:13.370664 1177669 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:21:13.370667 1177669 command_runner.go:130] > # limitation.
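The notifier wiring described in the comments above can be sketched as a drop-in config; the handler name "runc-debug" and the runtime path are illustrative assumptions, not values from this run:

```toml
# Hypothetical drop-in: /etc/crio/crio.conf.d/10-seccomp-notifier.conf
# Defines a runtime handler whose allowed_annotations permit the
# seccomp notifier annotation to be processed for pods using it.
[crio.runtime.runtimes.runc-debug]
runtime_path = "/usr/bin/runc"
allowed_annotations = [
	"io.kubernetes.cri-o.seccompNotifierAction",
]
```

A pod opting in would then carry the annotation io.kubernetes.cri-o.seccompNotifierAction set to "stop" and, per the comments above, restartPolicy "Never", so the kubelet does not restart the terminated workload.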
	I1218 00:21:13.370672 1177669 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:21:13.370680 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:21:13.370684 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.370688 1177669 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:21:13.370695 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.370699 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371091 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371100 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371106 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371111 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371151 1177669 command_runner.go:130] > allowed_annotations = [
	I1218 00:21:13.371159 1177669 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:21:13.371163 1177669 command_runner.go:130] > ]
	I1218 00:21:13.371167 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371172 1177669 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:21:13.371180 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:21:13.371184 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.371188 1177669 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:21:13.371224 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.371229 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371233 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371242 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371253 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371257 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371263 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371305 1177669 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:21:13.371314 1177669 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:21:13.371321 1177669 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:21:13.371342 1177669 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:21:13.371388 1177669 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:21:13.371402 1177669 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores; this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:21:13.371414 1177669 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:21:13.371421 1177669 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:21:13.371470 1177669 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:21:13.371479 1177669 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:21:13.371490 1177669 command_runner.go:130] > # signifying that the default value for that resource type should be overridden.
	I1218 00:21:13.371528 1177669 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:21:13.371536 1177669 command_runner.go:130] > # Example:
	I1218 00:21:13.371546 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:21:13.371551 1177669 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:21:13.371556 1177669 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:21:13.371561 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:21:13.371569 1177669 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:21:13.371573 1177669 command_runner.go:130] > # cpushares = "5"
	I1218 00:21:13.371606 1177669 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:21:13.371613 1177669 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:21:13.371617 1177669 command_runner.go:130] > # cpulimit = "35"
	I1218 00:21:13.371620 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.371629 1177669 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:21:13.371636 1177669 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:21:13.371647 1177669 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:21:13.371690 1177669 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:21:13.371702 1177669 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:21:13.371713 1177669 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:21:13.371718 1177669 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:21:13.371726 1177669 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:21:13.371777 1177669 command_runner.go:130] > # Default value is set to true
	I1218 00:21:13.371785 1177669 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:21:13.371791 1177669 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:21:13.371796 1177669 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:21:13.371805 1177669 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:21:13.371846 1177669 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:21:13.371855 1177669 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:21:13.371869 1177669 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:21:13.371873 1177669 command_runner.go:130] > # timezone = ""
	I1218 00:21:13.371880 1177669 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:21:13.371883 1177669 command_runner.go:130] > #
	I1218 00:21:13.371923 1177669 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:21:13.371933 1177669 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:21:13.371937 1177669 command_runner.go:130] > [crio.image]
	I1218 00:21:13.371948 1177669 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:21:13.371953 1177669 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:21:13.371960 1177669 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:21:13.372001 1177669 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372008 1177669 command_runner.go:130] > # global_auth_file = ""
	I1218 00:21:13.372014 1177669 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:21:13.372020 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372029 1177669 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.372036 1177669 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:21:13.372043 1177669 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372052 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372057 1177669 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:21:13.372094 1177669 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:21:13.372111 1177669 command_runner.go:130] > # When explicitly set to "", it will fall back to the entrypoint and command
	I1218 00:21:13.372119 1177669 command_runner.go:130] > # specified in the pause image. When commented out, it will fall back to the
	I1218 00:21:13.372125 1177669 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:21:13.372134 1177669 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:21:13.372140 1177669 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:21:13.372147 1177669 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:21:13.372187 1177669 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:21:13.372197 1177669 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:21:13.372204 1177669 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:21:13.372215 1177669 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:21:13.372270 1177669 command_runner.go:130] > # pinned_images = [
	I1218 00:21:13.372283 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372290 1177669 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:21:13.372301 1177669 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:21:13.372308 1177669 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:21:13.372319 1177669 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:21:13.372324 1177669 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:21:13.372362 1177669 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:21:13.372371 1177669 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:21:13.372384 1177669 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:21:13.372391 1177669 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:21:13.372402 1177669 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or system
	I1218 00:21:13.372408 1177669 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:21:13.372414 1177669 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:21:13.372450 1177669 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:21:13.372460 1177669 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:21:13.372464 1177669 command_runner.go:130] > # changing them here.
	I1218 00:21:13.372475 1177669 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:21:13.372479 1177669 command_runner.go:130] > # insecure_registries = [
	I1218 00:21:13.372482 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372489 1177669 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:21:13.372498 1177669 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:21:13.372502 1177669 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:21:13.372541 1177669 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:21:13.372549 1177669 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:21:13.372559 1177669 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:21:13.372567 1177669 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:21:13.372572 1177669 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:21:13.372582 1177669 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:21:13.372591 1177669 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:21:13.372630 1177669 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:21:13.372638 1177669 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:21:13.372643 1177669 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:21:13.372650 1177669 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:21:13.372667 1177669 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:21:13.372672 1177669 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:21:13.372678 1177669 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:21:13.372721 1177669 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:21:13.372730 1177669 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:21:13.372735 1177669 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:21:13.372746 1177669 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:21:13.372750 1177669 command_runner.go:130] > # CNI plugins.
	I1218 00:21:13.372753 1177669 command_runner.go:130] > [crio.network]
	I1218 00:21:13.372759 1177669 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:21:13.372769 1177669 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:21:13.372773 1177669 command_runner.go:130] > # cni_default_network = ""
	I1218 00:21:13.372780 1177669 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:21:13.372837 1177669 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:21:13.372851 1177669 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:21:13.372856 1177669 command_runner.go:130] > # plugin_dirs = [
	I1218 00:21:13.372860 1177669 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:21:13.372863 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372867 1177669 command_runner.go:130] > # List of included pod metrics.
	I1218 00:21:13.372903 1177669 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:21:13.372909 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372923 1177669 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:21:13.372927 1177669 command_runner.go:130] > [crio.metrics]
	I1218 00:21:13.372933 1177669 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:21:13.372941 1177669 command_runner.go:130] > # enable_metrics = false
	I1218 00:21:13.372946 1177669 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:21:13.372951 1177669 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:21:13.372958 1177669 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:21:13.372999 1177669 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:21:13.373006 1177669 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:21:13.373010 1177669 command_runner.go:130] > # metrics_collectors = [
	I1218 00:21:13.373018 1177669 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:21:13.373023 1177669 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:21:13.373033 1177669 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:21:13.373037 1177669 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:21:13.373042 1177669 command_runner.go:130] > # 	"operations_total",
	I1218 00:21:13.373077 1177669 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:21:13.373084 1177669 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:21:13.373089 1177669 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:21:13.373093 1177669 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:21:13.373098 1177669 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:21:13.373106 1177669 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:21:13.373111 1177669 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:21:13.373115 1177669 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:21:13.373120 1177669 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:21:13.373133 1177669 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:21:13.373167 1177669 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:21:13.373176 1177669 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:21:13.373179 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.373190 1177669 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:21:13.373199 1177669 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:21:13.373205 1177669 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:21:13.373209 1177669 command_runner.go:130] > # metrics_port = 9090
	I1218 00:21:13.373214 1177669 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:21:13.373222 1177669 command_runner.go:130] > # metrics_socket = ""
	I1218 00:21:13.373425 1177669 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:21:13.373436 1177669 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:21:13.373448 1177669 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:21:13.373454 1177669 command_runner.go:130] > # certificate on any modification event.
	I1218 00:21:13.373457 1177669 command_runner.go:130] > # metrics_cert = ""
	I1218 00:21:13.373463 1177669 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:21:13.373472 1177669 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:21:13.373475 1177669 command_runner.go:130] > # metrics_key = ""
	I1218 00:21:13.373510 1177669 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:21:13.373518 1177669 command_runner.go:130] > [crio.tracing]
	I1218 00:21:13.373528 1177669 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:21:13.373538 1177669 command_runner.go:130] > # enable_tracing = false
	I1218 00:21:13.373545 1177669 command_runner.go:130] > # Address on which the gRPC trace collector listens.
	I1218 00:21:13.373549 1177669 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:21:13.373560 1177669 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:21:13.373565 1177669 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:21:13.373569 1177669 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:21:13.373602 1177669 command_runner.go:130] > [crio.nri]
	I1218 00:21:13.373606 1177669 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:21:13.373614 1177669 command_runner.go:130] > # enable_nri = true
	I1218 00:21:13.373618 1177669 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:21:13.373623 1177669 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:21:13.373628 1177669 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:21:13.373632 1177669 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:21:13.373641 1177669 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:21:13.373646 1177669 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:21:13.373652 1177669 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:21:13.374323 1177669 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:21:13.374347 1177669 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:21:13.374353 1177669 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:21:13.374359 1177669 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:21:13.374369 1177669 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:21:13.374374 1177669 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:21:13.374384 1177669 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:21:13.374396 1177669 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:21:13.374400 1177669 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:21:13.374404 1177669 command_runner.go:130] > # - OCI hook injection
	I1218 00:21:13.374410 1177669 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:21:13.374419 1177669 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:21:13.374424 1177669 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:21:13.374429 1177669 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:21:13.374440 1177669 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:21:13.374447 1177669 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:21:13.374453 1177669 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:21:13.374461 1177669 command_runner.go:130] > #
	I1218 00:21:13.374470 1177669 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:21:13.374475 1177669 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:21:13.374481 1177669 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:21:13.374487 1177669 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:21:13.374497 1177669 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:21:13.374503 1177669 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:21:13.374508 1177669 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:21:13.374517 1177669 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:21:13.374520 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.374526 1177669 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:21:13.374532 1177669 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:21:13.374540 1177669 command_runner.go:130] > [crio.stats]
	I1218 00:21:13.374546 1177669 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:21:13.374552 1177669 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:21:13.374557 1177669 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:21:13.374567 1177669 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:21:13.374574 1177669 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:21:13.374578 1177669 command_runner.go:130] > # collection_period = 0
	I1218 00:21:13.375235 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337716712Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:21:13.375252 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337755529Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:21:13.375261 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337787676Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:21:13.375269 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337813217Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:21:13.375279 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337887603Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:21:13.375295 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.338323059Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:21:13.375307 1177669 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:21:13.375636 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:21:13.375654 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:21:13.375670 1177669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:21:13.375692 1177669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-240845 NodeName:functional-240845 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:21:13.375818 1177669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-240845"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
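
	Note: the kubeadm config written above is one four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) scp'd to /var/tmp/minikube/kubeadm.yaml.new. A minimal standalone sketch of inspecting such a stream; the /tmp path and the trimmed sample are illustrative, not minikube code:

	```shell
	# Write a trimmed stand-in for the multi-document kubeadm.yaml above.
	cat > /tmp/kubeadm-demo.yaml <<'EOF'
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	EOF
	# List each document's kind; documents are separated by "---" lines.
	grep '^kind:' /tmp/kubeadm-demo.yaml | awk '{print $2}'
	```

	The same one-liner run against the real file confirms all four config kinds made it into the generated manifest.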
	
	I1218 00:21:13.375897 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:21:13.382943 1177669 command_runner.go:130] > kubeadm
	I1218 00:21:13.382987 1177669 command_runner.go:130] > kubectl
	I1218 00:21:13.382992 1177669 command_runner.go:130] > kubelet
	I1218 00:21:13.383228 1177669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:21:13.383323 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:21:13.390563 1177669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (367 bytes)
	I1218 00:21:13.402469 1177669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:21:13.415695 1177669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2214 bytes)
	I1218 00:21:13.427935 1177669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:21:13.431432 1177669 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:21:13.431528 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:13.573724 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:13.587283 1177669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845 for IP: 192.168.49.2
	I1218 00:21:13.587308 1177669 certs.go:195] generating shared ca certs ...
	I1218 00:21:13.587325 1177669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:13.587468 1177669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:21:13.587523 1177669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:21:13.587535 1177669 certs.go:257] generating profile certs ...
	I1218 00:21:13.587627 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key
	I1218 00:21:13.587682 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key.83c30509
	I1218 00:21:13.587749 1177669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key
	I1218 00:21:13.587763 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:21:13.587778 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:21:13.587791 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:21:13.587807 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:21:13.587827 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:21:13.587840 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:21:13.587855 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:21:13.587866 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:21:13.587928 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:21:13.587965 1177669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:21:13.587976 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:21:13.588004 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:21:13.588031 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:21:13.588058 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:21:13.588108 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:21:13.588142 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.588156 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.588167 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.588757 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:21:13.607287 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:21:13.626005 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:21:13.643497 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:21:13.660653 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:21:13.677616 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 00:21:13.694313 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:21:13.711161 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:21:13.728011 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:21:13.745006 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:21:13.761771 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:21:13.778664 1177669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:21:13.791171 1177669 ssh_runner.go:195] Run: openssl version
	I1218 00:21:13.796833 1177669 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:21:13.797285 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.804618 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:21:13.812913 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816610 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816655 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816704 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.857240 1177669 command_runner.go:130] > 51391683
	I1218 00:21:13.857318 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:21:13.864756 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.871981 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:21:13.879459 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883023 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883055 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883126 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.923479 1177669 command_runner.go:130] > 3ec20f2e
	I1218 00:21:13.923967 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:21:13.931505 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.938743 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:21:13.946369 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950234 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950276 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950327 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.990419 1177669 command_runner.go:130] > b5213941
	I1218 00:21:13.990837 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
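
	Note: the ln/openssl sequence above is the standard OpenSSL CA-installation pattern: each PEM under /usr/share/ca-certificates gets a symlink in /etc/ssl/certs named after its subject hash (e.g. b5213941.0), which is how OpenSSL locates trust anchors at verify time. A self-contained sketch with a throwaway self-signed cert; the /tmp paths and CN are hypothetical:

	```shell
	# Generate a throwaway self-signed CA cert (hypothetical path and CN).
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" -days 1 \
	  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem 2>/dev/null
	# Compute the subject hash that names the symlink ("<hash>.0").
	HASH=$(openssl x509 -hash -noout -in /tmp/demo-ca.pem)
	# Install the hash-named symlink, as minikube does under /etc/ssl/certs.
	ln -fs /tmp/demo-ca.pem "/tmp/${HASH}.0"
	test -L "/tmp/${HASH}.0" && echo "installed ${HASH}.0"
	```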
	I1218 00:21:13.998401 1177669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003376 1177669 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003402 1177669 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:21:14.003409 1177669 command_runner.go:130] > Device: 259,1	Inode: 1327743     Links: 1
	I1218 00:21:14.003416 1177669 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:14.003422 1177669 command_runner.go:130] > Access: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003427 1177669 command_runner.go:130] > Modify: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003432 1177669 command_runner.go:130] > Change: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003438 1177669 command_runner.go:130] >  Birth: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003512 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:21:14.045243 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.045691 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:21:14.086658 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.086738 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:21:14.127897 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.128372 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:21:14.168626 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.169131 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:21:14.209194 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.209712 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:21:14.250333 1177669 command_runner.go:130] > Certificate will not expire
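
	Note: each `-checkend 86400` run above asks whether the certificate will still be valid 86400 seconds (24 hours) from now; openssl itself prints "Certificate will not expire" and exits 0 when it will, so minikube can decide whether to regenerate. A self-contained sketch with a throwaway cert; the /tmp paths and CN are hypothetical:

	```shell
	# Throwaway cert valid for 2 days (hypothetical path and CN).
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=expiry-demo" -days 2 \
	  -keyout /tmp/expiry-demo.key -out /tmp/expiry-demo.pem 2>/dev/null
	# Still valid 1 day out: exit 0, prints "Certificate will not expire".
	openssl x509 -noout -in /tmp/expiry-demo.pem -checkend 86400
	# Not valid 30 days out: exit 1, prints "Certificate will expire".
	openssl x509 -noout -in /tmp/expiry-demo.pem -checkend 2592000 || true
	```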
	I1218 00:21:14.250470 1177669 kubeadm.go:401] StartCluster: {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:21:14.250558 1177669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:21:14.250623 1177669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:21:14.277777 1177669 command_runner.go:130] > e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563
	I1218 00:21:14.277803 1177669 command_runner.go:130] > 0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c
	I1218 00:21:14.277811 1177669 command_runner.go:130] > 95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e
	I1218 00:21:14.277820 1177669 command_runner.go:130] > 1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5
	I1218 00:21:14.277826 1177669 command_runner.go:130] > 9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1
	I1218 00:21:14.277832 1177669 command_runner.go:130] > 3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad
	I1218 00:21:14.277838 1177669 command_runner.go:130] > cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15
	I1218 00:21:14.277846 1177669 command_runner.go:130] > 38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68
	I1218 00:21:14.277857 1177669 command_runner.go:130] > 1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b
	I1218 00:21:14.277868 1177669 command_runner.go:130] > 61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a
	I1218 00:21:14.277874 1177669 command_runner.go:130] > 98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe
	I1218 00:21:14.277883 1177669 command_runner.go:130] > 891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b
	I1218 00:21:14.277889 1177669 command_runner.go:130] > 2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de
	I1218 00:21:14.277898 1177669 command_runner.go:130] > b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06
	I1218 00:21:14.280281 1177669 cri.go:89] found id: "e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563"
	I1218 00:21:14.280303 1177669 cri.go:89] found id: "0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c"
	I1218 00:21:14.280308 1177669 cri.go:89] found id: "95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e"
	I1218 00:21:14.280312 1177669 cri.go:89] found id: "1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5"
	I1218 00:21:14.280315 1177669 cri.go:89] found id: "9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1"
	I1218 00:21:14.280319 1177669 cri.go:89] found id: "3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad"
	I1218 00:21:14.280323 1177669 cri.go:89] found id: "cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15"
	I1218 00:21:14.280326 1177669 cri.go:89] found id: "38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68"
	I1218 00:21:14.280329 1177669 cri.go:89] found id: "1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b"
	I1218 00:21:14.280337 1177669 cri.go:89] found id: "61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a"
	I1218 00:21:14.280343 1177669 cri.go:89] found id: "98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe"
	I1218 00:21:14.280347 1177669 cri.go:89] found id: "891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b"
	I1218 00:21:14.280355 1177669 cri.go:89] found id: "2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	I1218 00:21:14.280359 1177669 cri.go:89] found id: "b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06"
	I1218 00:21:14.280362 1177669 cri.go:89] found id: ""
	I1218 00:21:14.280415 1177669 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:21:14.291297 1177669 command_runner.go:130] ! time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	W1218 00:21:14.291357 1177669 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	I1218 00:21:14.291439 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:21:14.298396 1177669 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:21:14.298416 1177669 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:21:14.298422 1177669 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:21:14.298426 1177669 command_runner.go:130] > member
	I1218 00:21:14.299333 1177669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:21:14.299377 1177669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:21:14.299453 1177669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:21:14.306750 1177669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:21:14.307329 1177669 kubeconfig.go:125] found "functional-240845" server: "https://192.168.49.2:8441"
	I1218 00:21:14.308688 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.308922 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.310273 1177669 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:21:14.310295 1177669 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:21:14.310301 1177669 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:21:14.310306 1177669 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:21:14.310311 1177669 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:21:14.310598 1177669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:21:14.310964 1177669 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:21:14.321005 1177669 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:21:14.321038 1177669 kubeadm.go:602] duration metric: took 21.641512ms to restartPrimaryControlPlane
	I1218 00:21:14.321068 1177669 kubeadm.go:403] duration metric: took 70.601924ms to StartCluster
	I1218 00:21:14.321095 1177669 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.321175 1177669 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.321832 1177669 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.322054 1177669 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:21:14.322232 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:21:14.322270 1177669 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:21:14.322334 1177669 addons.go:70] Setting storage-provisioner=true in profile "functional-240845"
	I1218 00:21:14.322346 1177669 addons.go:239] Setting addon storage-provisioner=true in "functional-240845"
	W1218 00:21:14.322351 1177669 addons.go:248] addon storage-provisioner should already be in state true
	I1218 00:21:14.322373 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.322797 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.323222 1177669 addons.go:70] Setting default-storageclass=true in profile "functional-240845"
	I1218 00:21:14.323243 1177669 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-240845"
	I1218 00:21:14.323528 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.326222 1177669 out.go:179] * Verifying Kubernetes components...
	I1218 00:21:14.329298 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:14.352407 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.352567 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.353875 1177669 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:21:14.354521 1177669 addons.go:239] Setting addon default-storageclass=true in "functional-240845"
	W1218 00:21:14.354541 1177669 addons.go:248] addon default-storageclass should already be in state true
	I1218 00:21:14.354568 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.355010 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.357054 1177669 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.357084 1177669 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:21:14.357149 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.385892 1177669 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.385914 1177669 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:21:14.385974 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.412313 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.438252 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.538332 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:14.555412 1177669 node_ready.go:35] waiting up to 6m0s for node "functional-240845" to be "Ready" ...
	I1218 00:21:14.556627 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.558515 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:14.558665 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:14.559006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:14.569919 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.635955 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.636102 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.636146 1177669 retry.go:31] will retry after 274.076226ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.646979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.650760 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.650797 1177669 retry.go:31] will retry after 360.821893ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.911221 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.974464 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.974555 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.974595 1177669 retry.go:31] will retry after 225.739861ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.012854 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.055958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.056036 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.056342 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.079682 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.079793 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.079817 1177669 retry.go:31] will retry after 552.403697ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.200970 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.261673 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.261728 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.261746 1177669 retry.go:31] will retry after 669.780864ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.556091 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.632797 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.699577 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.699638 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.699664 1177669 retry.go:31] will retry after 634.295794ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.931763 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.990067 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.993514 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.993545 1177669 retry.go:31] will retry after 1.113615509s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.055858 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:16.334650 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:16.392078 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:16.395777 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.395856 1177669 retry.go:31] will retry after 558.474178ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.556248 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.556629 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:16.556701 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:16.955131 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:17.055832 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.055954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.056319 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:17.076617 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.076722 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.076755 1177669 retry.go:31] will retry after 1.676176244s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.108039 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:17.223472 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.223571 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.223606 1177669 retry.go:31] will retry after 1.165701868s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.556175 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.556607 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.056383 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.056458 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.056745 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.390333 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:18.466841 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.466880 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.466899 1177669 retry.go:31] will retry after 1.475434566s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.556290 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.556363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.556640 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.753095 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:18.817795 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.817871 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.817893 1177669 retry.go:31] will retry after 1.833170296s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:19.056294 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.056363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:19.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:19.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.556536 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.556903 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:19.943440 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:20.003817 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.008032 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.008069 1177669 retry.go:31] will retry after 3.979109659s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.056345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.056668 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.556404 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.556792 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.652153 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:20.711890 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.715639 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.715672 1177669 retry.go:31] will retry after 3.637109781s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:21.056958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.057040 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.057388 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:21.057444 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:21.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.555927 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.556005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.556330 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.056025 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.056094 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.056444 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.556246 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.556345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.556676 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:23.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:23.987349 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:24.051441 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.051487 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.051524 1177669 retry.go:31] will retry after 5.3171516s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.056654 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.056732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.057111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:24.353838 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:24.413422 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.413469 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.413487 1177669 retry.go:31] will retry after 3.340127313s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.555854 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.555928 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:26.056042 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.056124 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.056522 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:26.056585 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:26.556332 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.556411 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.056517 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.056589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.056942 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.555642 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.754507 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:27.812979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:27.813026 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:27.813045 1177669 retry.go:31] will retry after 6.95951013s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:28.056456 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.056550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:28.057006 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:28.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.055872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.368874 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:29.425933 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:29.429391 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.429423 1177669 retry.go:31] will retry after 6.711424265s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.555717 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.055742 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.055823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:30.556179 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:31.055879 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.055958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.056290 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:31.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.556084 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.056199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.555959 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.556028 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.556363 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:32.556413 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:33.055772 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.055844 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.056367 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:33.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.556178 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.055882 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.055955 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.556326 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.556397 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:34.556796 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:34.773144 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:34.829958 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:34.833899 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:34.833929 1177669 retry.go:31] will retry after 8.542321591s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:35.056329 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.056407 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.056770 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:35.556516 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.556605 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.556959 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.057369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.057701 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.141963 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:36.202438 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:36.202477 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.202496 1177669 retry.go:31] will retry after 7.758270018s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:37.055818 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.055893 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:37.056274 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:37.555746 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.555822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.055812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.056254 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.556149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:39.055837 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.056297 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:39.056352 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:39.556245 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.556663 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.056509 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.056606 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.056917 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.555706 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.055770 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.056183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.555731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:41.556201 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:42.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.055854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:42.556028 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.556119 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.556476 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.056276 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.056351 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.056698 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.377156 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:43.435792 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:43.439377 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.439408 1177669 retry.go:31] will retry after 18.255208537s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.556665 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.556738 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.557098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:43.557163 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:43.961544 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:44.047619 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:44.047656 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.047681 1177669 retry.go:31] will retry after 16.124184127s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.055890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.056245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:44.556158 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.556259 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.556606 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:45.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.057068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1218 00:21:45.555737 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.555812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.556144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:46.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:46.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:46.555906 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.556364 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.055801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.555784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.056189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:48.056258 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:48.555948 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.556021 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.556370 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.056069 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.056148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.056513 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.556531 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.556886 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:50.056538 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.056619 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.056964 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:50.057018 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:50.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.556116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.055815 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.055884 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.056198 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.555734 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:52.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:53.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.056005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.056346 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:53.556049 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.056030 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.056102 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.056454 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.556467 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.556542 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.556870 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:54.556927 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:55.055618 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.055704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.056046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:55.555616 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.555993 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.055712 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:57.055721 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:57.056210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:57.555892 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.555965 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.556299 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.555793 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.555877 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.556239 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:59.556292 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:00.055898 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.055985 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.056349 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:00.172859 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:00.349113 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:00.349165 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.349188 1177669 retry.go:31] will retry after 15.178958797s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.556482 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.556554 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.056710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.057020 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.555743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.695619 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:01.764253 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:01.768251 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:01.768286 1177669 retry.go:31] will retry after 20.261734519s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:02.055637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.055714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.056058 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:02.056113 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:02.555751 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.555820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.555659 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.555732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.556022 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:04.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.056080 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:04.056144 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:04.556235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.556662 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.056450 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.056522 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.056859 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.556627 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.556731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.557039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:06.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:06.056174 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:06.555712 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.556120 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.056150 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.555911 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.555990 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.556341 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:08.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.055811 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.056155 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:08.056203 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:08.555901 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.555978 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.556327 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:10.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.055797 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:10.056287 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:10.555954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.556049 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.556390 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.555929 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.056135 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:12.556162 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:13.055804 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.056174 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:13.555873 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.555943 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.056098 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.056468 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.556454 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.556529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.556828 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:14.556880 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:15.056592 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.056660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.057019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.528571 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:15.555957 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.556023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.556331 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.591509 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:15.594869 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:15.594900 1177669 retry.go:31] will retry after 30.932709272s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:16.056512 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.056582 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.056902 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:16.555621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.555718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:17.055743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.056124 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:17.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:17.555739 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.555834 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.055987 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.056365 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.556060 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.556132 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.556480 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:19.056235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.056623 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:19.056697 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:19.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.556660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.556999 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.056621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.056700 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.057051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.555764 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.556195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.055711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:21.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:22.030818 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:22.056348 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.056446 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.056751 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:22.091069 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:22.094766 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.094798 1177669 retry.go:31] will retry after 47.715756714s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.556528 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.556883 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.056081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.555699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.555792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:24.055868 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.055942 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.056302 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:24.056357 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:24.556255 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.556349 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.056263 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.056721 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.556539 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.556623 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.557021 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.055767 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.055851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.555952 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.556027 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:26.556419 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:27.056075 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:27.556423 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.556518 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.056638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.057038 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.555732 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.555814 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.556167 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:29.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:29.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:29.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.055846 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.055931 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.056335 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.556057 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.556129 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.556500 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:31.056282 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.056362 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.056704 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:31.056761 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:31.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.556564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.556896 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.055712 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.556248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.055753 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.556054 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.556161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.556684 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:33.556740 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:34.056460 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.056532 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.056854 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:34.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.556213 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.555889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.556242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:36.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.056089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:36.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:36.555705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.055826 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.056241 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.555756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:38.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.056147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:38.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:38.555776 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.556214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.056109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.555711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.555807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.556166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:40.055954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.056034 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.056481 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:40.056546 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:40.556379 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.556469 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.556814 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.056496 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.056800 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.556572 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.556672 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.557008 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.055744 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.056248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.555835 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.555911 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.556276 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:42.556330 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:43.056000 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.056095 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.056464 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:43.556247 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.556319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.056503 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.056852 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.556012 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.556109 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:44.556512 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:45.057726 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.057809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.058234 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:45.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.556053 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.556425 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.055813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.528809 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:46.556363 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.556430 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.556863 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:46.556912 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:46.592076 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592111 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592213 1177669 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:22:47.056004 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.056462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:47.556264 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.556334 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.556652 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.056410 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.056481 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.056790 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.556515 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.556589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.556921 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:48.556975 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:49.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.055730 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:49.555733 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.555821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.556169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.055866 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.055935 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.555707 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.555815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:51.056519 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.056627 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:51.057002 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:51.555636 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.555709 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.556029 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.055820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.555872 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.555945 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.055695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.055766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.056179 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.555862 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.555934 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:53.556377 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:54.056043 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.056137 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.056494 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:54.556337 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.556432 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.556763 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.056566 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.056640 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.056965 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.555663 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.556084 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:56.056189 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:56.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.556131 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.056558 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.056624 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.056923 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.556638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.556705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.556914 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.055638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.055992 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.556608 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.556858 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:58.556906 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:59.055615 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.055693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.055962 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:59.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.055787 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.055870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.056214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.555725 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:01.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:01.056324 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:01.555994 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.556068 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.556424 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.056333 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.556470 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.556547 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.055624 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.055721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.056047 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:03.556183 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:04.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:04.556084 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.556154 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.556510 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.056318 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.056386 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.056739 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.556502 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.556575 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.556888 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:05.556938 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:06.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.056072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:06.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.555824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.556163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.056617 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.056703 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.057017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.555759 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.556245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:08.055620 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.055732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:08.056120 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:08.555828 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.555904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.055992 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.056064 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.056490 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.556279 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.811086 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:23:09.870262 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873844 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873941 1177669 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:23:09.877215 1177669 out.go:179] * Enabled addons: 
	I1218 00:23:09.880843 1177669 addons.go:530] duration metric: took 1m55.558566134s for enable addons: enabled=[]
	I1218 00:23:10.056212 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.056346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.056713 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:10.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:10.556554 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.556967 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.055656 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.055785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.555809 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.555880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.556212 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.055838 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.556050 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:12.556108 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:13.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.055825 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.056171 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:13.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.556080 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.556462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.055821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.056182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.556315 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.556385 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:14.556741 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:15.056401 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.056470 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.056793 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:15.556411 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.556480 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.556780 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.056647 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.056963 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:17.055850 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.055925 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.056282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:17.056334 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:17.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.556122 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.556497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.056286 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.056361 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.056685 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.556408 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.556802 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:19.056570 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.056646 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.057040 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:19.057095 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:19.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.555860 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.555958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.556317 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.056081 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.056416 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.555830 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.555903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:21.556323 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:22.056015 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.056091 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.056432 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:22.556188 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.556285 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.556619 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.056387 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.056459 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.056805 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.556577 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.556991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:23.557043 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:24.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:24.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.556090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.556429 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.056237 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.056319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.056651 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.556407 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.556484 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.556804 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:26.056601 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.056678 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.057039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:26.057096 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:26.556349 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.556417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.556670 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.056529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.555635 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.555714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.556073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.556018 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.556350 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:28.556398 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:29.056066 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.056141 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.056558 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:29.556410 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.556482 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.556819 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.056192 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.056697 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.556347 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.556425 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.556813 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:30.556882 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:31.056651 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.056724 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.057110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:31.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:33.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:33.056171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:33.556520 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.556626 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.557595 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.055711 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.555832 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.555907 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.556266 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:35.055820 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.056262 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:35.056317 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:35.555980 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.556055 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.556475 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.056253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.056329 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.056689 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.556253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.556322 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.556585 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:37.056343 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.056417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.056777 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:37.056832 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:37.556557 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.556628 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.055768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.055671 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.056076 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.555915 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.555994 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:39.556347 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:40.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.056458 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:40.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.556320 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.556648 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.056467 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.056544 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.556710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:41.557023 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:42.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:42.555713 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.055805 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.055880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.056188 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:44.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.055912 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.056273 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:44.056335 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:44.555920 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.555993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.556368 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.058236 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.058319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.058728 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.556602 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.556934 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.055727 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.056067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.556151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:46.556209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:47.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.056162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:47.555674 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.055810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.556441 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.556514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.556832 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:48.556892 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:49.056604 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.056674 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.057002 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:49.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.555771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.055675 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:51.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:51.056177 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:51.555865 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.555937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.556310 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.555777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:53.055816 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.055886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.056231 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:53.056281 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:53.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.556010 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.556362 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.556026 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.556097 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.556417 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.055678 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.056101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.555734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:55.556129 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:56.055730 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:56.555867 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.555946 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.556300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.056090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.056457 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.556250 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.556323 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.556654 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:57.556712 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:58.056487 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:58.555623 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.055906 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.055982 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.056358 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.555710 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.555803 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:00.059640 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.059720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.060067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:00.060115 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	[… identical "Request"/"Response" GET polls of https://192.168.49.2:8441/api/v1/nodes/functional-240845 repeated every ~500 ms, each refused with "connect: connection refused"; only the periodic retry warnings are kept below …]
	W1218 00:24:02.556145 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:04.556502 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:07.056668 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:09.556165 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:11.556574 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:14.056425 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:16.556191 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:19.056164 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:21.556151 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:24.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:26.057019 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:28.556166 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:30.556245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:24:33.056107 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:33.555768 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.556187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.055758 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.055833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.556178 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:35.056326 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.056396 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.056747 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:35.056801 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:35.556586 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.556657 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.557044 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.055782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.555908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.556282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:37.056627 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.057073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:37.057140 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:37.555781 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.555850 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.055869 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.055947 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.056284 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.055844 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.055924 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.056291 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:39.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:40.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:40.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.056055 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.555718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.556072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:42.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:42.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:42.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.556016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.556296 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.556369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:44.556718 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:45.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.057363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.057788 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:45.556579 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.556652 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.556974 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.555790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:47.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.055779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.056086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:47.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:47.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.555807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.555758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.556101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:49.556155 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:50.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.056148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:50.555821 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.555892 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.556250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.056032 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.056394 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:51.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:52.055852 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.056308 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:52.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.556071 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.556442 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.056207 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.056295 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.056620 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.556372 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.556448 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.556772 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:53.556829 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:54.056587 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.056664 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.057045 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:54.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.556382 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.056125 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:56.055825 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.055904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.056298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:56.056356 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:56.556003 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.556076 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.056092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.555906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.556260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:58.556314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:59.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.055748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:59.555874 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.555954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.055976 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.056062 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.056441 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.556138 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.556250 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.556688 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:00.556759 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:01.056530 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.056604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.056936 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:01.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.555731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.056275 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.555950 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.556022 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.556393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:03.056088 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.056161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.056527 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:03.056581 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:03.556314 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.556377 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.556644 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.056325 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.056398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.056741 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.556656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.556735 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.557093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.055643 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.556394 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.556464 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.556836 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:05.556889 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:06.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.057085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:06.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.056100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.556130 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:08.055819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.056255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:08.056314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:08.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.556052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.556389 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.556157 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.556549 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:10.056352 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.056426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.056786 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:10.056843 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:10.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.556658 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.557006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.055714 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.555890 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.555963 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.556324 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.056036 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.056112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.056497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.556273 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.556347 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:12.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:13.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.056492 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.056825 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:13.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.556340 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.556615 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.055978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.056067 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.555796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.556186 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:15.055759 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.055835 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.056169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:15.056244 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:15.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.556434 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.056196 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.056293 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.056642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.556550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.556891 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.055661 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.056001 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.556096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:17.556152 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:18.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:18.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.055840 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.055916 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.555697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:19.556192 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:20.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:20.555796 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.555870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.556255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.055952 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.056023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.056395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.556064 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.556138 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.556504 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:21.556557 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:22.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.056350 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.056700 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:22.556495 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.556573 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.556915 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.055663 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.055991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.555759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:24.055734 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.055816 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.056168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:24.056239 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:24.556073 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.556516 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:26.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.055868 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.056210 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:26.056286 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:26.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.556109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.055906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.056250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.555667 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.556156 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:28.556210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:29.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:29.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.056614 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.555633 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.555715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:31.055809 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.056307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:31.056368 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:31.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.055756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.555778 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.555851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:33.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.056351 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:33.056404 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:33.556040 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.556451 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.056004 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.056660 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.556087 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.555845 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.556288 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:35.556349 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:36.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:36.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.555888 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:38.055607 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.055689 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.056039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:38.056098 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:38.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.055853 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.056286 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.556210 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.556639 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:40.056445 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.056865 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:40.056930 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:40.556620 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.556695 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.557017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.555789 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.555863 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.556189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.056163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.555934 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.556328 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:42.556374 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:43.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:43.555816 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.556292 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.055998 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.056073 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.556595 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.556924 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:44.556979 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:45.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.056201 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:45.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.056492 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.056876 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.556642 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.556716 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.557036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:46.557089 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:47.055763 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.055839 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:47.555908 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.555986 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.556307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.055720 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.556093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:49.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.056114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:49.056169 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:49.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.055833 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.055910 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.056293 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.555974 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.556043 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:51.056056 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.056128 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.056465 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:51.056513 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:51.556270 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.556344 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.556681 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.056466 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.056539 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.056895 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.555612 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.555693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.556206 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.055909 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.055981 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:53.556175 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:54.055792 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.056260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:54.556080 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.556156 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.556472 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.555651 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.556079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:56.056196 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:56.555837 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.555913 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.056033 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.056356 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.056107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.555780 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.555854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.556190 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:58.556260 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:59.055926 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.056011 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:59.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.556343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.556642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.059082 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.059161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.059514 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.556566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.556913 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:00.556965 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:01.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.055719 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:01.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.555754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:03.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.056098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:03.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:03.555672 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.056115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.556167 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.556265 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.556617 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:05.056442 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.056514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:05.056907 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:05.556597 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.556667 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.556997 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.556092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.055659 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.055728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.056019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:07.556123 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:08.055665 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.055743 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.056061 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:08.555631 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.555705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.055707 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.055787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.555670 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.556065 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:10.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.055794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:10.056268 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:10.555749 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.555819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.556146 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.055855 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:12.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:13.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:13.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.555886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.556252 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.055957 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.056026 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.056385 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.556596 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.556938 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:14.556992 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:15.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:15.555805 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.555887 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.055807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.555813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:17.055842 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.055915 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.056281 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:17.056342 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:17.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.555824 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.555898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.556258 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.055724 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.555983 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.556056 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.556402 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:19.556458 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:20.056180 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.056281 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.056653 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:20.556480 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.556560 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.056634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.056717 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.057043 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:22.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.055734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.056082 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:22.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:22.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.556259 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.055950 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.056030 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.056393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.555649 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.556074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.055763 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.556096 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.556167 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.556536 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:24.556589 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:25.056122 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.056197 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.056567 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:25.556331 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.556402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.556737 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.056615 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.056954 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.555650 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:27.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:27.056160 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:27.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.555829 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.055860 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.055937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.556069 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.556395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:29.056209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:29.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.055732 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.055808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.056154 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.555757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:31.055778 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.055856 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:31.056278 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:31.555669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.555744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.555701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:33.055858 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.056301 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:33.056353 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:33.556038 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.556445 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.056214 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.056311 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.056650 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.556604 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.556677 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.557012 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:35.056645 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.056718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.057052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:35.057102 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:35.555605 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.555680 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.556018 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.055752 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.055826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.056172 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.556126 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.055668 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.056096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.555797 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.555867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.556203 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:37.556272 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:38.055962 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.056083 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.056495 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:38.556271 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.556346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.556695 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.055905 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.055976 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.056296 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.556206 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:39.556792 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:40.056703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.056787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.057218 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:40.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.555750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:42.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.055860 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:42.056301 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:42.555953 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.556025 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.556420 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.555853 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.556039 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.556118 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:44.556541 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:45.055766 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.055855 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:45.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.555793 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:47.055949 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.056052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.056438 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:47.056488 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:47.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.556324 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.556696 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.056448 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.056853 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.556536 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.556604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.556871 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:49.056619 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.056692 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.057011 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:49.057072 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:49.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.556115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.055723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.555726 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.555805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:51.556259 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:52.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.056215 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:52.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.056106 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.555801 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.556202 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:53.556285 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:54.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.056407 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:54.556321 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.556398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.557023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.055680 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.555795 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.555866 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.556181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:56.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.055805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.056166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:56.056245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:56.555703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.555700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.556119 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:58.556172 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:59.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.055903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:59.555918 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.555995 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.056095 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.056184 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.056542 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.556340 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.556426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.556768 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:00.556820 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:01.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.056617 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.056937 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:01.555665 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.056048 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.056120 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.056471 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.556307 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.556375 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:03.056489 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.056566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.056907 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:03.056959 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:03.556548 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.556616 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.556947 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.055614 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.055691 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.056023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.556067 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.556168 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.055755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.056077 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.556113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:05.556171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:06.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.056074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:06.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.555767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.055795 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.555673 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.556083 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:08.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.055846 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.056205 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:08.056297 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:08.555965 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.556379 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.055815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.056111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.556064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:10.556121 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:11.055789 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:11.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.055985 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.555638 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.555713 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.556042 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:13.055626 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.055698 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.056031 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:13.056081 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:13.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.556147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.055908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:14.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.555886 1177669 node_ready.go:38] duration metric: took 6m0.000394955s for node "functional-240845" to be "Ready" ...
	I1218 00:27:14.559015 1177669 out.go:203] 
	W1218 00:27:14.562031 1177669 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:27:14.562056 1177669 out.go:285] * 
	W1218 00:27:14.564187 1177669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:27:14.567133 1177669 out.go:203] 

** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-240845 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 7m41.100055872s for "functional-240845" cluster.
I1218 00:27:15.151105 1159552 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctional/serial/SoftStart]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctional/serial/SoftStart]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-240845
helpers_test.go:244: (dbg) docker inspect functional-240845:

-- stdout --
	[
	    {
	        "Id": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	        "Created": "2025-12-18T00:18:49.336039923Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1175534,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:18:49.397861382Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hostname",
	        "HostsPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hosts",
	        "LogPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2-json.log",
	        "Name": "/functional-240845",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-240845:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-240845",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	                "LowerDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/merged",
	                "UpperDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/diff",
	                "WorkDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-240845",
	                "Source": "/var/lib/docker/volumes/functional-240845/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-240845",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-240845",
	                "name.minikube.sigs.k8s.io": "functional-240845",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "80ff640c2f3e079a9c83df8e9e88ea18985e04567ee70a1bf3deb87b69d7a9ef",
	            "SandboxKey": "/var/run/docker/netns/80ff640c2f3e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33920"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33921"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33924"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33922"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33923"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-240845": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "f2:33:56:5f:da:77",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3f9ded1bec62ca4e0acc6643285f4a8aef2088de15bf9d1e6dbf478246c82ae7",
	                    "EndpointID": "a267c79a59d712dbf268b4db11b833499096e030f2777b578bf84c7f9519c961",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-240845",
	                        "5d3e3e2a238b"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845: exit status 2 (373.771534ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctional/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctional/serial/SoftStart]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs -n 25: (1.617809004s)
helpers_test.go:261: TestFunctional/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                          ARGS                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons  │ addons-399099 addons disable cloud-spanner --alsologtostderr -v=1                                                       │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:16 UTC │                     │
	│ ip      │ addons-399099 ip                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ addons-399099 addons disable ingress-dns --alsologtostderr -v=1                                                         │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │                     │
	│ addons  │ addons-399099 addons disable ingress --alsologtostderr -v=1                                                             │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │                     │
	│ stop    │ -p addons-399099                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ enable dashboard -p addons-399099                                                                                       │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ disable dashboard -p addons-399099                                                                                      │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ disable gvisor -p addons-399099                                                                                         │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ delete  │ -p addons-399099                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ start   │ -p nospam-499800 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-499800 --driver=docker  --container-runtime=crio │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ delete  │ -p nospam-499800                                                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:19 UTC │
	│ start   │ -p functional-240845 --alsologtostderr -v=8                                                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:19 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:19:34
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:19:34.105346 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105377 1177669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:19:34.105397 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105673 1177669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:19:34.106120 1177669 out.go:368] Setting JSON to false
	I1218 00:19:34.107069 1177669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25322,"bootTime":1765991852,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:19:34.107165 1177669 start.go:143] virtualization:  
	I1218 00:19:34.110567 1177669 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:19:34.114275 1177669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:19:34.114378 1177669 notify.go:221] Checking for updates...
	I1218 00:19:34.120029 1177669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:19:34.122925 1177669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:19:34.125751 1177669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:19:34.128638 1177669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:19:34.131461 1177669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:19:34.134887 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:34.134985 1177669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:19:34.159427 1177669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:19:34.159542 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.223972 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.214884618 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.224090 1177669 docker.go:319] overlay module found
	I1218 00:19:34.227220 1177669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:19:34.229963 1177669 start.go:309] selected driver: docker
	I1218 00:19:34.229985 1177669 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.230103 1177669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:19:34.230199 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.285040 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.2764408 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.285449 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:19:34.285507 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:19:34.285561 1177669 start.go:353] cluster config:
	{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.290381 1177669 out.go:179] * Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	I1218 00:19:34.293210 1177669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:19:34.297960 1177669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:19:34.300783 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:19:34.300829 1177669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:19:34.300855 1177669 cache.go:65] Caching tarball of preloaded images
	I1218 00:19:34.300881 1177669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:19:34.300940 1177669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:19:34.300950 1177669 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:19:34.301056 1177669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/config.json ...
	I1218 00:19:34.320164 1177669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:19:34.320186 1177669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:19:34.320203 1177669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:19:34.320279 1177669 start.go:360] acquireMachinesLock for functional-240845: {Name:mk3ed718f4cde9dd7b19ef8d5bcd86c3175b5067 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:19:34.320350 1177669 start.go:364] duration metric: took 45.89µs to acquireMachinesLock for "functional-240845"
	I1218 00:19:34.320375 1177669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:19:34.320383 1177669 fix.go:54] fixHost starting: 
	I1218 00:19:34.320643 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:19:34.337200 1177669 fix.go:112] recreateIfNeeded on functional-240845: state=Running err=<nil>
	W1218 00:19:34.337231 1177669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:19:34.340534 1177669 out.go:252] * Updating the running docker "functional-240845" container ...
	I1218 00:19:34.340583 1177669 machine.go:94] provisionDockerMachine start ...
	I1218 00:19:34.340661 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.357593 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.357953 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.357966 1177669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:19:34.511862 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.511889 1177669 ubuntu.go:182] provisioning hostname "functional-240845"
	I1218 00:19:34.511951 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.530122 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.530421 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.530437 1177669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-240845 && echo "functional-240845" | sudo tee /etc/hostname
	I1218 00:19:34.693713 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.693796 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.711115 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.711437 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.711457 1177669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-240845' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-240845/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-240845' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:19:34.868676 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:19:34.868704 1177669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:19:34.868727 1177669 ubuntu.go:190] setting up certificates
	I1218 00:19:34.868737 1177669 provision.go:84] configureAuth start
	I1218 00:19:34.868796 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:34.885386 1177669 provision.go:143] copyHostCerts
	I1218 00:19:34.885436 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885473 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:19:34.885484 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885557 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:19:34.885647 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885670 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:19:34.885675 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885701 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:19:34.885784 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885802 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:19:34.885807 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885830 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:19:34.885882 1177669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-240845 san=[127.0.0.1 192.168.49.2 functional-240845 localhost minikube]
	I1218 00:19:35.070465 1177669 provision.go:177] copyRemoteCerts
	I1218 00:19:35.070558 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:19:35.070625 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.089175 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:35.196164 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:19:35.196247 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:19:35.213266 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:19:35.213323 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:19:35.231357 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:19:35.231416 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:19:35.249293 1177669 provision.go:87] duration metric: took 380.542312ms to configureAuth
	I1218 00:19:35.249372 1177669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:19:35.249565 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:35.249673 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.267176 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:35.267503 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:35.267526 1177669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:19:40.661888 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:19:40.661918 1177669 machine.go:97] duration metric: took 6.321326566s to provisionDockerMachine
	I1218 00:19:40.661929 1177669 start.go:293] postStartSetup for "functional-240845" (driver="docker")
	I1218 00:19:40.661947 1177669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:19:40.662006 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:19:40.662069 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.679665 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.787680 1177669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:19:40.790725 1177669 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:19:40.790745 1177669 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:19:40.790750 1177669 command_runner.go:130] > VERSION_ID="12"
	I1218 00:19:40.790757 1177669 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:19:40.790762 1177669 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:19:40.790766 1177669 command_runner.go:130] > ID=debian
	I1218 00:19:40.790771 1177669 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:19:40.790776 1177669 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:19:40.790785 1177669 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:19:40.790821 1177669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:19:40.790843 1177669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:19:40.790853 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:19:40.790906 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:19:40.790988 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:19:40.791003 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:19:40.791081 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:19:40.791089 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:19:40.791141 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:19:40.798177 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:19:40.814786 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:19:40.830892 1177669 start.go:296] duration metric: took 168.948549ms for postStartSetup
	I1218 00:19:40.831030 1177669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:19:40.831082 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.848091 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.952833 1177669 command_runner.go:130] > 13%
	I1218 00:19:40.953354 1177669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:19:40.957853 1177669 command_runner.go:130] > 171G
	I1218 00:19:40.958309 1177669 fix.go:56] duration metric: took 6.637921757s for fixHost
	I1218 00:19:40.958329 1177669 start.go:83] releasing machines lock for "functional-240845", held for 6.637966499s
	I1218 00:19:40.958394 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:40.975843 1177669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:19:40.975911 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.976173 1177669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:19:40.976254 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.995610 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.013560 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.099878 1177669 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:19:41.100025 1177669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:19:41.195326 1177669 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:19:41.198525 1177669 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:19:41.198598 1177669 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:19:41.198697 1177669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:19:41.321255 1177669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:19:41.326138 1177669 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:19:41.326216 1177669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:19:41.326312 1177669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:19:41.337406 1177669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:19:41.337470 1177669 start.go:496] detecting cgroup driver to use...
	I1218 00:19:41.337517 1177669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:19:41.337604 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:19:41.364732 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:19:41.395259 1177669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:19:41.395373 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:19:41.425216 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:19:41.453795 1177669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:19:41.688599 1177669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:19:41.909163 1177669 docker.go:234] disabling docker service ...
	I1218 00:19:41.909312 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:19:41.926883 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:19:41.943387 1177669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:19:42.156451 1177669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:19:42.449825 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:19:42.467750 1177669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:19:42.493864 1177669 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:19:42.495463 1177669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:19:42.495560 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.506971 1177669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:19:42.507118 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.518977 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.530876 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.539925 1177669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:19:42.553447 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.569558 1177669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.582698 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.597525 1177669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:19:42.608606 1177669 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:19:42.609612 1177669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:19:42.617962 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:19:42.846451 1177669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:21:13.130293 1177669 ssh_runner.go:235] Completed: sudo systemctl restart crio: (1m30.283808536s)
	I1218 00:21:13.130318 1177669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:21:13.130368 1177669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:21:13.134416 1177669 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:21:13.134438 1177669 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:21:13.134453 1177669 command_runner.go:130] > Device: 0,72	Inode: 804         Links: 1
	I1218 00:21:13.134460 1177669 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:13.134465 1177669 command_runner.go:130] > Access: 2025-12-18 00:21:13.087402358 +0000
	I1218 00:21:13.134471 1177669 command_runner.go:130] > Modify: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134475 1177669 command_runner.go:130] > Change: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134479 1177669 command_runner.go:130] >  Birth: -
	I1218 00:21:13.134836 1177669 start.go:564] Will wait 60s for crictl version
	I1218 00:21:13.134895 1177669 ssh_runner.go:195] Run: which crictl
	I1218 00:21:13.138647 1177669 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:21:13.138725 1177669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:21:13.167266 1177669 command_runner.go:130] > Version:  0.1.0
	I1218 00:21:13.167284 1177669 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:21:13.167289 1177669 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:21:13.167294 1177669 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:21:13.169251 1177669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:21:13.169347 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.194596 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.194618 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.194624 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.194629 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.194634 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.194639 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.194643 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.194656 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.194660 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.194671 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.194674 1177669 command_runner.go:130] >      static
	I1218 00:21:13.194678 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.194682 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.194686 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.194689 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.194693 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.194697 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.194701 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.194705 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.194709 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.196349 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.221274 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.221297 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.221302 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.221308 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.221313 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.221318 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.221321 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.221326 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.221331 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.221334 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.221338 1177669 command_runner.go:130] >      static
	I1218 00:21:13.221341 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.221345 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.221350 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.221353 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.221357 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.221360 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.221364 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.221369 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.221373 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.226046 1177669 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:21:13.228983 1177669 cli_runner.go:164] Run: docker network inspect functional-240845 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:21:13.244579 1177669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:21:13.248178 1177669 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:21:13.248440 1177669 kubeadm.go:884] updating cluster {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA API
ServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:fal
se DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:21:13.248553 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:21:13.248613 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.282229 1177669 command_runner.go:130] > {
	I1218 00:21:13.282251 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.282256 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282265 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.282269 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282275 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.282279 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282283 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282294 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.282305 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.282308 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282313 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.282332 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282342 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282346 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282350 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282356 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.282364 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282370 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.282373 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282378 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282389 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.282398 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.282403 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282408 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.282415 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282422 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282426 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282434 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282444 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.282449 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282454 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.282462 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282466 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282475 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.282483 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.282491 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282495 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.282499 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282503 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282506 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282509 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282516 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.282523 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282528 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.282532 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282536 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282549 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.282557 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.282564 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282569 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.282578 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.282586 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282589 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282592 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282599 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.282606 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282611 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.282615 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282624 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282631 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.282643 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.282647 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282651 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.282658 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282661 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282665 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282669 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282676 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282680 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282698 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282709 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.282714 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282719 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.282726 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282729 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282737 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.282746 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.282751 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282755 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.282759 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282765 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282769 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282777 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282782 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282785 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282788 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282795 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.282802 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282807 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.282811 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282815 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282828 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.282836 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.282843 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282850 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.282853 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282862 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282865 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282869 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282873 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282883 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282887 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282894 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.282902 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282907 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.282910 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282913 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282922 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.282941 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.282952 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282957 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.282967 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282970 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282973 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282976 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282984 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.282999 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283004 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.283007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283010 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283018 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.283026 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.283029 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283036 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.283040 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283046 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.283054 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283061 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283065 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.283067 1177669 command_runner.go:130] >     },
	I1218 00:21:13.283071 1177669 command_runner.go:130] >     {
	I1218 00:21:13.283079 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.283084 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283089 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.283092 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283099 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283107 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.283116 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.283122 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283126 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.283129 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283133 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.283136 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283144 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283148 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.283155 1177669 command_runner.go:130] >     }
	I1218 00:21:13.283158 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.283161 1177669 command_runner.go:130] > }
	I1218 00:21:13.283336 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.283347 1177669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:21:13.283410 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.307800 1177669 command_runner.go:130] > {
	I1218 00:21:13.307819 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.307823 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307831 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.307836 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307841 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.307845 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307849 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307861 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.307869 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.307872 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307877 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.307881 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307886 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307889 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307893 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307899 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.307903 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307909 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.307912 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307921 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307929 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.307940 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.307943 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307947 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.307951 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307959 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307962 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307965 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307971 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.307975 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307980 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.307983 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307987 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307995 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.308003 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.308007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308011 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.308015 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308020 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308023 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308026 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308032 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.308036 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308042 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.308045 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308049 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308057 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.308065 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.308068 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308072 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.308080 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.308084 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308087 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308090 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308099 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.308103 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308108 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.308111 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308114 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308122 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.308129 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.308132 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308136 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.308140 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308143 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308146 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308149 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308153 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308156 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308159 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308165 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.308168 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308173 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.308176 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308180 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308188 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.308195 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.308198 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308202 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.308206 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308210 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308213 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308217 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308241 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308244 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308247 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308253 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.308262 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308269 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.308275 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308279 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308287 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.308295 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.308298 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308302 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.308306 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308309 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308312 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308316 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308319 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308323 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308325 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308332 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.308335 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308340 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.308343 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308347 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308354 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.308370 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.308374 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308377 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.308381 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308385 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308387 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308390 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308397 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.308400 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308405 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.308408 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308412 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308422 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.308430 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.308433 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308437 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.308440 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308444 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308447 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308450 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308455 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308458 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308461 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308468 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.308472 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308477 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.308480 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308484 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308491 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.308498 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.308501 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308505 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.308508 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308512 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.308515 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308518 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308522 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.308524 1177669 command_runner.go:130] >     }
	I1218 00:21:13.308527 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.308529 1177669 command_runner.go:130] > }
	I1218 00:21:13.310403 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.310424 1177669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:21:13.310432 1177669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.3 crio true true} ...
	I1218 00:21:13.310536 1177669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-240845 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:21:13.310619 1177669 ssh_runner.go:195] Run: crio config
	I1218 00:21:13.358161 1177669 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:21:13.358186 1177669 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:21:13.358194 1177669 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:21:13.358198 1177669 command_runner.go:130] > #
	I1218 00:21:13.358205 1177669 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:21:13.358212 1177669 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:21:13.358218 1177669 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:21:13.358229 1177669 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:21:13.358236 1177669 command_runner.go:130] > # reload'.
	I1218 00:21:13.358243 1177669 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:21:13.358250 1177669 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:21:13.358258 1177669 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:21:13.358264 1177669 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:21:13.358267 1177669 command_runner.go:130] > [crio]
	I1218 00:21:13.358273 1177669 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:21:13.358277 1177669 command_runner.go:130] > # containers images, in this directory.
	I1218 00:21:13.358820 1177669 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:21:13.358837 1177669 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:21:13.359435 1177669 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:21:13.359448 1177669 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:21:13.359935 1177669 command_runner.go:130] > # imagestore = ""
	I1218 00:21:13.359950 1177669 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:21:13.359963 1177669 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:21:13.360646 1177669 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:21:13.360660 1177669 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:21:13.360667 1177669 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:21:13.360961 1177669 command_runner.go:130] > # storage_option = [
	I1218 00:21:13.361308 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.361321 1177669 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:21:13.361334 1177669 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:21:13.361921 1177669 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:21:13.361934 1177669 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:21:13.361949 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:21:13.361954 1177669 command_runner.go:130] > # always happen on a node reboot
	I1218 00:21:13.362559 1177669 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:21:13.362583 1177669 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:21:13.362590 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:21:13.362595 1177669 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:21:13.363052 1177669 command_runner.go:130] > # version_file_persist = ""
	I1218 00:21:13.363067 1177669 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:21:13.363076 1177669 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:21:13.363680 1177669 command_runner.go:130] > # internal_wipe = true
	I1218 00:21:13.363702 1177669 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:21:13.363709 1177669 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:21:13.364349 1177669 command_runner.go:130] > # internal_repair = true
	I1218 00:21:13.364361 1177669 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:21:13.364368 1177669 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:21:13.364377 1177669 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:21:13.364926 1177669 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:21:13.364942 1177669 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:21:13.364946 1177669 command_runner.go:130] > [crio.api]
	I1218 00:21:13.364951 1177669 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:21:13.365581 1177669 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:21:13.365594 1177669 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:21:13.367685 1177669 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:21:13.367700 1177669 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:21:13.367706 1177669 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:21:13.367710 1177669 command_runner.go:130] > # stream_port = "0"
	I1218 00:21:13.367716 1177669 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:21:13.367723 1177669 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:21:13.367730 1177669 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:21:13.367745 1177669 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:21:13.367752 1177669 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:21:13.367762 1177669 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367766 1177669 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:21:13.367773 1177669 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:21:13.367780 1177669 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367784 1177669 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:21:13.367791 1177669 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:21:13.367802 1177669 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:21:13.367808 1177669 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:21:13.367814 1177669 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:21:13.367835 1177669 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367844 1177669 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:21:13.367853 1177669 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367861 1177669 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:21:13.367868 1177669 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:21:13.367879 1177669 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:21:13.367883 1177669 command_runner.go:130] > [crio.runtime]
	I1218 00:21:13.367893 1177669 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:21:13.367904 1177669 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:21:13.367916 1177669 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:21:13.367926 1177669 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:21:13.367934 1177669 command_runner.go:130] > # default_ulimits = [
	I1218 00:21:13.367937 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.367950 1177669 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:21:13.367958 1177669 command_runner.go:130] > # no_pivot = false
	I1218 00:21:13.367963 1177669 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:21:13.367974 1177669 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:21:13.367979 1177669 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:21:13.367988 1177669 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:21:13.367994 1177669 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:21:13.368004 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368012 1177669 command_runner.go:130] > # conmon = ""
	I1218 00:21:13.368015 1177669 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:21:13.368023 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:21:13.368026 1177669 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:21:13.368035 1177669 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:21:13.368044 1177669 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:21:13.368051 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368058 1177669 command_runner.go:130] > # conmon_env = [
	I1218 00:21:13.368061 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368070 1177669 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:21:13.368076 1177669 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:21:13.368084 1177669 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:21:13.368089 1177669 command_runner.go:130] > # default_env = [
	I1218 00:21:13.368092 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368098 1177669 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:21:13.368111 1177669 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:21:13.368119 1177669 command_runner.go:130] > # selinux = false
	I1218 00:21:13.368125 1177669 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:21:13.368136 1177669 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:21:13.368144 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368148 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.368159 1177669 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:21:13.368167 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368171 1177669 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:21:13.368178 1177669 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:21:13.368189 1177669 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:21:13.368199 1177669 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:21:13.368206 1177669 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:21:13.368212 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368217 1177669 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:21:13.368256 1177669 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:21:13.368261 1177669 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:21:13.368266 1177669 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:21:13.368280 1177669 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:21:13.368287 1177669 command_runner.go:130] > # blockio parameters.
	I1218 00:21:13.368292 1177669 command_runner.go:130] > # blockio_reload = false
	I1218 00:21:13.368298 1177669 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:21:13.368303 1177669 command_runner.go:130] > # irqbalance daemon.
	I1218 00:21:13.368311 1177669 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:21:13.368320 1177669 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:21:13.368327 1177669 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:21:13.368337 1177669 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:21:13.368347 1177669 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:21:13.368357 1177669 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:21:13.368365 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368369 1177669 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:21:13.368375 1177669 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:21:13.368382 1177669 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:21:13.368388 1177669 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:21:13.368396 1177669 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:21:13.368402 1177669 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:21:13.368412 1177669 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:21:13.368419 1177669 command_runner.go:130] > # will be added.
	I1218 00:21:13.368423 1177669 command_runner.go:130] > # default_capabilities = [
	I1218 00:21:13.368430 1177669 command_runner.go:130] > # 	"CHOWN",
	I1218 00:21:13.368434 1177669 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:21:13.368442 1177669 command_runner.go:130] > # 	"FSETID",
	I1218 00:21:13.368445 1177669 command_runner.go:130] > # 	"FOWNER",
	I1218 00:21:13.368457 1177669 command_runner.go:130] > # 	"SETGID",
	I1218 00:21:13.368461 1177669 command_runner.go:130] > # 	"SETUID",
	I1218 00:21:13.368479 1177669 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:21:13.368487 1177669 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:21:13.368490 1177669 command_runner.go:130] > # 	"KILL",
	I1218 00:21:13.368494 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368506 1177669 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:21:13.368515 1177669 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:21:13.368524 1177669 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:21:13.368531 1177669 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:21:13.368539 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368542 1177669 command_runner.go:130] > default_sysctls = [
	I1218 00:21:13.368547 1177669 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:21:13.368554 1177669 command_runner.go:130] > ]
	I1218 00:21:13.368563 1177669 command_runner.go:130] > # List of devices on the host that a
	I1218 00:21:13.368570 1177669 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:21:13.368577 1177669 command_runner.go:130] > # allowed_devices = [
	I1218 00:21:13.368580 1177669 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:21:13.368588 1177669 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:21:13.368594 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368603 1177669 command_runner.go:130] > # List of additional devices. specified as
	I1218 00:21:13.368611 1177669 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:21:13.368618 1177669 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:21:13.368624 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368628 1177669 command_runner.go:130] > # additional_devices = [
	I1218 00:21:13.368633 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368639 1177669 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:21:13.368646 1177669 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:21:13.368649 1177669 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:21:13.368653 1177669 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:21:13.368664 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368673 1177669 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:21:13.368683 1177669 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:21:13.368701 1177669 command_runner.go:130] > # Defaults to false.
	I1218 00:21:13.368712 1177669 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:21:13.368719 1177669 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:21:13.368725 1177669 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:21:13.368734 1177669 command_runner.go:130] > # hooks_dir = [
	I1218 00:21:13.368739 1177669 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:21:13.368745 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368751 1177669 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:21:13.368761 1177669 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:21:13.368770 1177669 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:21:13.368773 1177669 command_runner.go:130] > #
	I1218 00:21:13.368780 1177669 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:21:13.368789 1177669 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:21:13.368795 1177669 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:21:13.368803 1177669 command_runner.go:130] > #
	I1218 00:21:13.368809 1177669 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:21:13.368818 1177669 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:21:13.368829 1177669 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:21:13.368846 1177669 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:21:13.368853 1177669 command_runner.go:130] > #
	I1218 00:21:13.368857 1177669 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:21:13.368866 1177669 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:21:13.368876 1177669 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:21:13.368880 1177669 command_runner.go:130] > # pids_limit = -1
	I1218 00:21:13.368886 1177669 command_runner.go:130] > # Maximum sized allowed for the container log file. Negative numbers indicate
	I1218 00:21:13.368894 1177669 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:21:13.368904 1177669 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:21:13.368917 1177669 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:21:13.368923 1177669 command_runner.go:130] > # log_size_max = -1
	I1218 00:21:13.368931 1177669 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:21:13.368938 1177669 command_runner.go:130] > # log_to_journald = false
	I1218 00:21:13.368944 1177669 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:21:13.368949 1177669 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:21:13.368959 1177669 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:21:13.368968 1177669 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:21:13.368974 1177669 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:21:13.368981 1177669 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:21:13.368986 1177669 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:21:13.368993 1177669 command_runner.go:130] > # read_only = false
	I1218 00:21:13.369000 1177669 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:21:13.369009 1177669 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:21:13.369017 1177669 command_runner.go:130] > # live configuration reload.
	I1218 00:21:13.369020 1177669 command_runner.go:130] > # log_level = "info"
	I1218 00:21:13.369026 1177669 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:21:13.369031 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.369036 1177669 command_runner.go:130] > # log_filter = ""
	I1218 00:21:13.369043 1177669 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369052 1177669 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:21:13.369056 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369067 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369074 1177669 command_runner.go:130] > # uid_mappings = ""
	I1218 00:21:13.369084 1177669 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369093 1177669 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:21:13.369097 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369105 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369114 1177669 command_runner.go:130] > # gid_mappings = ""
	I1218 00:21:13.369120 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:21:13.369127 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369139 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369150 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369158 1177669 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:21:13.369165 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:21:13.369174 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369184 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369192 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369196 1177669 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:21:13.369208 1177669 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:21:13.369218 1177669 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:21:13.369224 1177669 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:21:13.369231 1177669 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:21:13.369238 1177669 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:21:13.369247 1177669 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:21:13.369256 1177669 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:21:13.369261 1177669 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:21:13.369265 1177669 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:21:13.369273 1177669 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:21:13.369279 1177669 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:21:13.369286 1177669 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:21:13.369293 1177669 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:21:13.369301 1177669 command_runner.go:130] > # shared_cpuset  determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:21:13.369310 1177669 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:21:13.369320 1177669 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:21:13.369326 1177669 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:21:13.369333 1177669 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:21:13.369339 1177669 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:21:13.369347 1177669 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:21:13.369351 1177669 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:21:13.369359 1177669 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:21:13.369363 1177669 command_runner.go:130] > # pinns_path = ""
	I1218 00:21:13.369368 1177669 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:21:13.369378 1177669 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:21:13.369382 1177669 command_runner.go:130] > # enable_criu_support = true
	I1218 00:21:13.369390 1177669 command_runner.go:130] > # Enable/disable the generation of the container and
	I1218 00:21:13.369400 1177669 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:21:13.369407 1177669 command_runner.go:130] > # enable_pod_events = false
	I1218 00:21:13.369414 1177669 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:21:13.369422 1177669 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:21:13.369426 1177669 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:21:13.369431 1177669 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:21:13.369443 1177669 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior of the path being created as a directory).
	I1218 00:21:13.369457 1177669 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:21:13.369465 1177669 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:21:13.369474 1177669 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:21:13.369481 1177669 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:21:13.369486 1177669 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:21:13.369492 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.369499 1177669 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:21:13.369509 1177669 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:21:13.369515 1177669 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:21:13.369521 1177669 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:21:13.369523 1177669 command_runner.go:130] > #
	I1218 00:21:13.369528 1177669 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:21:13.369536 1177669 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:21:13.369540 1177669 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:21:13.369548 1177669 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:21:13.369553 1177669 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:21:13.369561 1177669 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:21:13.369565 1177669 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:21:13.369574 1177669 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:21:13.369577 1177669 command_runner.go:130] > # monitor_env = []
	I1218 00:21:13.369585 1177669 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:21:13.369590 1177669 command_runner.go:130] > # allowed_annotations = []
	I1218 00:21:13.369595 1177669 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:21:13.369599 1177669 command_runner.go:130] > # no_sync_log = false
	I1218 00:21:13.369603 1177669 command_runner.go:130] > # default_annotations = {}
	I1218 00:21:13.369611 1177669 command_runner.go:130] > # stream_websockets = false
	I1218 00:21:13.369614 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.369664 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.369673 1177669 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:21:13.369680 1177669 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:21:13.369686 1177669 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:21:13.369697 1177669 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:21:13.369708 1177669 command_runner.go:130] > #   in $PATH.
	I1218 00:21:13.369718 1177669 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:21:13.369728 1177669 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:21:13.369735 1177669 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:21:13.369741 1177669 command_runner.go:130] > #   state.
	I1218 00:21:13.369747 1177669 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:21:13.369753 1177669 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:21:13.369759 1177669 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:21:13.369765 1177669 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:21:13.369774 1177669 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:21:13.369780 1177669 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:21:13.369789 1177669 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:21:13.369795 1177669 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:21:13.369805 1177669 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:21:13.369813 1177669 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:21:13.369820 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:21:13.369831 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:21:13.370100 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:21:13.370120 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:21:13.370129 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:21:13.370143 1177669 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:21:13.370151 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:21:13.370162 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:21:13.370169 1177669 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:21:13.370176 1177669 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:21:13.370187 1177669 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:21:13.370195 1177669 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:21:13.370206 1177669 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:21:13.370213 1177669 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:21:13.370219 1177669 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:21:13.370232 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:21:13.370239 1177669 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:21:13.370249 1177669 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:21:13.370266 1177669 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:21:13.370271 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:21:13.370283 1177669 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:21:13.370288 1177669 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:21:13.370295 1177669 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:21:13.370305 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:21:13.370313 1177669 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:21:13.370317 1177669 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:21:13.370329 1177669 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:21:13.370338 1177669 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:21:13.370350 1177669 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:21:13.370357 1177669 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:21:13.370367 1177669 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:21:13.370375 1177669 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:21:13.370388 1177669 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:21:13.370395 1177669 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:21:13.370408 1177669 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:21:13.370420 1177669 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:21:13.370425 1177669 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:21:13.370437 1177669 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:21:13.370445 1177669 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:21:13.370459 1177669 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:21:13.370466 1177669 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:21:13.370473 1177669 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:21:13.370485 1177669 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:21:13.370488 1177669 command_runner.go:130] > #
	I1218 00:21:13.370493 1177669 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:21:13.370496 1177669 command_runner.go:130] > #
	I1218 00:21:13.370506 1177669 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:21:13.370513 1177669 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:21:13.370516 1177669 command_runner.go:130] > #
	I1218 00:21:13.370525 1177669 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:21:13.370537 1177669 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:21:13.370545 1177669 command_runner.go:130] > #
	I1218 00:21:13.370553 1177669 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:21:13.370556 1177669 command_runner.go:130] > # feature.
	I1218 00:21:13.370563 1177669 command_runner.go:130] > #
	I1218 00:21:13.370569 1177669 command_runner.go:130] > # If everything is setup, CRI-O will modify chosen seccomp profiles for
	I1218 00:21:13.370576 1177669 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:21:13.370587 1177669 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:21:13.370594 1177669 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:21:13.370600 1177669 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:21:13.370610 1177669 command_runner.go:130] > #
	I1218 00:21:13.370618 1177669 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:21:13.370625 1177669 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:21:13.370628 1177669 command_runner.go:130] > #
	I1218 00:21:13.370638 1177669 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:21:13.370644 1177669 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:21:13.370647 1177669 command_runner.go:130] > #
	I1218 00:21:13.370657 1177669 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:21:13.370664 1177669 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:21:13.370667 1177669 command_runner.go:130] > # limitation.
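	The comments above describe how a runtime handler opts into the seccomp notifier: the handler must list the annotation under allowed_annotations, and the pod then sets that annotation. A minimal sketch of such a drop-in follows; the drop-in path and handler name are illustrative, not taken from this run's configuration:

	```toml
	# /etc/crio/crio.conf.d/10-notifier.conf (hypothetical drop-in path)
	[crio.runtime.runtimes.crun-notify]
	runtime_path = "/usr/libexec/crio/crun"    # assumes crun >= 0.19, which supports the notifier
	monitor_path = "/usr/libexec/crio/conmon"
	allowed_annotations = [
		"io.kubernetes.cri-o.seccompNotifierAction",
	]
	```

	A pod selecting this handler would then set the annotation io.kubernetes.cri-o.seccompNotifierAction=stop and, per the notes above, restartPolicy: Never so the kubelet does not immediately restart the terminated container.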
	I1218 00:21:13.370672 1177669 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:21:13.370680 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:21:13.370684 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.370688 1177669 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:21:13.370695 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.370699 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371091 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371100 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371106 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371111 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371151 1177669 command_runner.go:130] > allowed_annotations = [
	I1218 00:21:13.371159 1177669 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:21:13.371163 1177669 command_runner.go:130] > ]
	I1218 00:21:13.371167 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371172 1177669 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:21:13.371180 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:21:13.371184 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.371188 1177669 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:21:13.371224 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.371229 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371233 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371242 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371253 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371257 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371263 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371305 1177669 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:21:13.371314 1177669 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:21:13.371321 1177669 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:21:13.371342 1177669 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:21:13.371388 1177669 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:21:13.371402 1177669 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:21:13.371414 1177669 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:21:13.371421 1177669 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:21:13.371470 1177669 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:21:13.371479 1177669 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:21:13.371490 1177669 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:21:13.371528 1177669 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:21:13.371536 1177669 command_runner.go:130] > # Example:
	I1218 00:21:13.371546 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:21:13.371551 1177669 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:21:13.371556 1177669 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:21:13.371561 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:21:13.371569 1177669 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:21:13.371573 1177669 command_runner.go:130] > # cpushares = "5"
	I1218 00:21:13.371606 1177669 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:21:13.371613 1177669 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:21:13.371617 1177669 command_runner.go:130] > # cpulimit = "35"
	I1218 00:21:13.371620 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.371629 1177669 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:21:13.371636 1177669 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:21:13.371647 1177669 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:21:13.371690 1177669 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:21:13.371702 1177669 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:21:13.371713 1177669 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:21:13.371718 1177669 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:21:13.371726 1177669 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:21:13.371777 1177669 command_runner.go:130] > # Default value is set to true
	I1218 00:21:13.371785 1177669 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:21:13.371791 1177669 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:21:13.371796 1177669 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:21:13.371805 1177669 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:21:13.371846 1177669 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:21:13.371855 1177669 command_runner.go:130] > # timezone sets the timezone for a container in CRI-O.
	I1218 00:21:13.371869 1177669 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:21:13.371873 1177669 command_runner.go:130] > # timezone = ""
	I1218 00:21:13.371880 1177669 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:21:13.371883 1177669 command_runner.go:130] > #
	I1218 00:21:13.371923 1177669 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:21:13.371933 1177669 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:21:13.371937 1177669 command_runner.go:130] > [crio.image]
	I1218 00:21:13.371948 1177669 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:21:13.371953 1177669 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:21:13.371960 1177669 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:21:13.372001 1177669 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372008 1177669 command_runner.go:130] > # global_auth_file = ""
	I1218 00:21:13.372014 1177669 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:21:13.372020 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372029 1177669 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.372036 1177669 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:21:13.372043 1177669 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372052 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372057 1177669 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:21:13.372094 1177669 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:21:13.372111 1177669 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:21:13.372119 1177669 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:21:13.372125 1177669 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:21:13.372134 1177669 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:21:13.372140 1177669 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:21:13.372147 1177669 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:21:13.372187 1177669 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:21:13.372197 1177669 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:21:13.372204 1177669 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:21:13.372215 1177669 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:21:13.372270 1177669 command_runner.go:130] > # pinned_images = [
	I1218 00:21:13.372283 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372290 1177669 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:21:13.372301 1177669 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:21:13.372308 1177669 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:21:13.372319 1177669 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:21:13.372324 1177669 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:21:13.372362 1177669 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:21:13.372371 1177669 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:21:13.372384 1177669 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:21:13.372391 1177669 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:21:13.372402 1177669 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or system
	I1218 00:21:13.372408 1177669 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:21:13.372414 1177669 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:21:13.372450 1177669 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:21:13.372460 1177669 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:21:13.372464 1177669 command_runner.go:130] > # changing them here.
	I1218 00:21:13.372475 1177669 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:21:13.372479 1177669 command_runner.go:130] > # insecure_registries = [
	I1218 00:21:13.372482 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372489 1177669 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:21:13.372498 1177669 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:21:13.372502 1177669 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:21:13.372541 1177669 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:21:13.372549 1177669 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:21:13.372559 1177669 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:21:13.372567 1177669 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:21:13.372572 1177669 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:21:13.372582 1177669 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:21:13.372591 1177669 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:21:13.372630 1177669 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:21:13.372638 1177669 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:21:13.372643 1177669 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:21:13.372650 1177669 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:21:13.372667 1177669 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used, but the results are ambiguous.
	I1218 00:21:13.372672 1177669 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:21:13.372678 1177669 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:21:13.372721 1177669 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:21:13.372730 1177669 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:21:13.372735 1177669 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:21:13.372746 1177669 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:21:13.372750 1177669 command_runner.go:130] > # CNI plugins.
	I1218 00:21:13.372753 1177669 command_runner.go:130] > [crio.network]
	I1218 00:21:13.372759 1177669 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:21:13.372769 1177669 command_runner.go:130] > # CRI-O will pick-up the first one found in network_dir.
	I1218 00:21:13.372773 1177669 command_runner.go:130] > # cni_default_network = ""
	I1218 00:21:13.372780 1177669 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:21:13.372837 1177669 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:21:13.372851 1177669 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:21:13.372856 1177669 command_runner.go:130] > # plugin_dirs = [
	I1218 00:21:13.372860 1177669 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:21:13.372863 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372867 1177669 command_runner.go:130] > # List of included pod metrics.
	I1218 00:21:13.372903 1177669 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:21:13.372909 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372923 1177669 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:21:13.372927 1177669 command_runner.go:130] > [crio.metrics]
	I1218 00:21:13.372933 1177669 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:21:13.372941 1177669 command_runner.go:130] > # enable_metrics = false
	I1218 00:21:13.372946 1177669 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:21:13.372951 1177669 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:21:13.372958 1177669 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:21:13.372999 1177669 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:21:13.373006 1177669 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:21:13.373010 1177669 command_runner.go:130] > # metrics_collectors = [
	I1218 00:21:13.373018 1177669 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:21:13.373023 1177669 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:21:13.373033 1177669 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:21:13.373037 1177669 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:21:13.373042 1177669 command_runner.go:130] > # 	"operations_total",
	I1218 00:21:13.373077 1177669 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:21:13.373084 1177669 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:21:13.373089 1177669 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:21:13.373093 1177669 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:21:13.373098 1177669 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:21:13.373106 1177669 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:21:13.373111 1177669 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:21:13.373115 1177669 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:21:13.373120 1177669 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:21:13.373133 1177669 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:21:13.373167 1177669 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:21:13.373176 1177669 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:21:13.373179 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.373190 1177669 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:21:13.373199 1177669 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:21:13.373205 1177669 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:21:13.373209 1177669 command_runner.go:130] > # metrics_port = 9090
	I1218 00:21:13.373214 1177669 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:21:13.373222 1177669 command_runner.go:130] > # metrics_socket = ""
	I1218 00:21:13.373425 1177669 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:21:13.373436 1177669 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:21:13.373448 1177669 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:21:13.373454 1177669 command_runner.go:130] > # certificate on any modification event.
	I1218 00:21:13.373457 1177669 command_runner.go:130] > # metrics_cert = ""
	I1218 00:21:13.373463 1177669 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:21:13.373472 1177669 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:21:13.373475 1177669 command_runner.go:130] > # metrics_key = ""
	I1218 00:21:13.373510 1177669 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:21:13.373518 1177669 command_runner.go:130] > [crio.tracing]
	I1218 00:21:13.373528 1177669 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:21:13.373538 1177669 command_runner.go:130] > # enable_tracing = false
	I1218 00:21:13.373545 1177669 command_runner.go:130] > # Address on which the gRPC trace collector listens.
	I1218 00:21:13.373549 1177669 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:21:13.373560 1177669 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:21:13.373565 1177669 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:21:13.373569 1177669 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:21:13.373602 1177669 command_runner.go:130] > [crio.nri]
	I1218 00:21:13.373606 1177669 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:21:13.373614 1177669 command_runner.go:130] > # enable_nri = true
	I1218 00:21:13.373618 1177669 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:21:13.373623 1177669 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:21:13.373628 1177669 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:21:13.373632 1177669 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:21:13.373641 1177669 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:21:13.373646 1177669 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:21:13.373652 1177669 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:21:13.374323 1177669 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:21:13.374347 1177669 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:21:13.374353 1177669 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:21:13.374359 1177669 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:21:13.374369 1177669 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:21:13.374374 1177669 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:21:13.374384 1177669 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:21:13.374396 1177669 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:21:13.374400 1177669 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:21:13.374404 1177669 command_runner.go:130] > # - OCI hook injection
	I1218 00:21:13.374410 1177669 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:21:13.374419 1177669 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:21:13.374424 1177669 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:21:13.374429 1177669 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:21:13.374440 1177669 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:21:13.374447 1177669 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:21:13.374453 1177669 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:21:13.374461 1177669 command_runner.go:130] > #
	I1218 00:21:13.374470 1177669 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:21:13.374475 1177669 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:21:13.374481 1177669 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:21:13.374487 1177669 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:21:13.374497 1177669 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:21:13.374503 1177669 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:21:13.374508 1177669 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:21:13.374517 1177669 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:21:13.374520 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.374526 1177669 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:21:13.374532 1177669 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:21:13.374540 1177669 command_runner.go:130] > [crio.stats]
	I1218 00:21:13.374546 1177669 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:21:13.374552 1177669 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:21:13.374557 1177669 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:21:13.374567 1177669 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:21:13.374574 1177669 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:21:13.374578 1177669 command_runner.go:130] > # collection_period = 0
	I1218 00:21:13.375235 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337716712Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:21:13.375252 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337755529Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:21:13.375261 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337787676Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:21:13.375269 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337813217Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:21:13.375279 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337887603Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:21:13.375295 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.338323059Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:21:13.375307 1177669 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:21:13.375636 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:21:13.375654 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:21:13.375670 1177669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:21:13.375692 1177669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-240845 NodeName:functional-240845 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc
/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:21:13.375818 1177669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-240845"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
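The kubeadm config rendered above is a four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration). A minimal sketch of sanity-checking that structure, using a trimmed stand-in for the real file (the path and contents here are hypothetical, not the file minikube writes):

```shell
# Write a trimmed stand-in for the generated kubeadm.yaml (hypothetical path).
cat > /tmp/kubeadm-demo.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
# Each YAML document declares exactly one kind; four kinds means four documents.
grep -c '^kind:' /tmp/kubeadm-demo.yaml
# prints: 4
```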
	I1218 00:21:13.375897 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:21:13.382943 1177669 command_runner.go:130] > kubeadm
	I1218 00:21:13.382987 1177669 command_runner.go:130] > kubectl
	I1218 00:21:13.382992 1177669 command_runner.go:130] > kubelet
	I1218 00:21:13.383228 1177669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:21:13.383323 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:21:13.390563 1177669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (367 bytes)
	I1218 00:21:13.402469 1177669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:21:13.415695 1177669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2214 bytes)
	I1218 00:21:13.427935 1177669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:21:13.431432 1177669 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:21:13.431528 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:13.573724 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:13.587283 1177669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845 for IP: 192.168.49.2
	I1218 00:21:13.587308 1177669 certs.go:195] generating shared ca certs ...
	I1218 00:21:13.587325 1177669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:13.587468 1177669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:21:13.587523 1177669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:21:13.587535 1177669 certs.go:257] generating profile certs ...
	I1218 00:21:13.587627 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key
	I1218 00:21:13.587682 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key.83c30509
	I1218 00:21:13.587749 1177669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key
	I1218 00:21:13.587763 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:21:13.587778 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:21:13.587791 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:21:13.587807 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:21:13.587827 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:21:13.587840 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:21:13.587855 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:21:13.587866 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:21:13.587928 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:21:13.587965 1177669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:21:13.587976 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:21:13.588004 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:21:13.588031 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:21:13.588058 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:21:13.588108 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:21:13.588142 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.588156 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.588167 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.588757 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:21:13.607287 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:21:13.626005 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:21:13.643497 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:21:13.660653 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:21:13.677616 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 00:21:13.694313 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:21:13.711161 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:21:13.728011 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:21:13.745006 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:21:13.761771 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:21:13.778664 1177669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:21:13.791171 1177669 ssh_runner.go:195] Run: openssl version
	I1218 00:21:13.796833 1177669 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:21:13.797285 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.804618 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:21:13.812913 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816610 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816655 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816704 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.857240 1177669 command_runner.go:130] > 51391683
	I1218 00:21:13.857318 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:21:13.864756 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.871981 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:21:13.879459 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883023 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883055 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883126 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.923479 1177669 command_runner.go:130] > 3ec20f2e
	I1218 00:21:13.923967 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:21:13.931505 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.938743 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:21:13.946369 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950234 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950276 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950327 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.990419 1177669 command_runner.go:130] > b5213941
	I1218 00:21:13.990837 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
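The three `openssl x509 -hash` runs above compute the subject hash that names the `/etc/ssl/certs/<hash>.0` symlink each `ln -fs` creates. A small sketch reproducing the scheme with a throwaway self-signed certificate (paths and subject are hypothetical):

```shell
# Generate a throwaway self-signed CA certificate (hypothetical paths/subject).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo-ca" \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem 2>/dev/null
# The subject hash (8 hex digits) is what names the /etc/ssl/certs/<hash>.0 link.
openssl x509 -hash -noout -in /tmp/demo-ca.pem
```

The hash depends on the certificate's subject, so the value differs per cert; in the log above, `minikubeCA.pem` hashes to `b5213941` and is linked as `/etc/ssl/certs/b5213941.0`.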
	I1218 00:21:13.998401 1177669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003376 1177669 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003402 1177669 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:21:14.003409 1177669 command_runner.go:130] > Device: 259,1	Inode: 1327743     Links: 1
	I1218 00:21:14.003416 1177669 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:14.003422 1177669 command_runner.go:130] > Access: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003427 1177669 command_runner.go:130] > Modify: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003432 1177669 command_runner.go:130] > Change: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003438 1177669 command_runner.go:130] >  Birth: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003512 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:21:14.045243 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.045691 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:21:14.086658 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.086738 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:21:14.127897 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.128372 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:21:14.168626 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.169131 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:21:14.209194 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.209712 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:21:14.250333 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.250470 1177669 kubeadm.go:401] StartCluster: {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APISer
verNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false
DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:21:14.250558 1177669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:21:14.250623 1177669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:21:14.277777 1177669 command_runner.go:130] > e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563
	I1218 00:21:14.277803 1177669 command_runner.go:130] > 0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c
	I1218 00:21:14.277811 1177669 command_runner.go:130] > 95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e
	I1218 00:21:14.277820 1177669 command_runner.go:130] > 1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5
	I1218 00:21:14.277826 1177669 command_runner.go:130] > 9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1
	I1218 00:21:14.277832 1177669 command_runner.go:130] > 3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad
	I1218 00:21:14.277838 1177669 command_runner.go:130] > cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15
	I1218 00:21:14.277846 1177669 command_runner.go:130] > 38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68
	I1218 00:21:14.277857 1177669 command_runner.go:130] > 1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b
	I1218 00:21:14.277868 1177669 command_runner.go:130] > 61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a
	I1218 00:21:14.277874 1177669 command_runner.go:130] > 98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe
	I1218 00:21:14.277883 1177669 command_runner.go:130] > 891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b
	I1218 00:21:14.277889 1177669 command_runner.go:130] > 2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de
	I1218 00:21:14.277898 1177669 command_runner.go:130] > b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06
	I1218 00:21:14.280281 1177669 cri.go:89] found id: "e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563"
	I1218 00:21:14.280303 1177669 cri.go:89] found id: "0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c"
	I1218 00:21:14.280308 1177669 cri.go:89] found id: "95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e"
	I1218 00:21:14.280312 1177669 cri.go:89] found id: "1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5"
	I1218 00:21:14.280315 1177669 cri.go:89] found id: "9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1"
	I1218 00:21:14.280319 1177669 cri.go:89] found id: "3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad"
	I1218 00:21:14.280323 1177669 cri.go:89] found id: "cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15"
	I1218 00:21:14.280326 1177669 cri.go:89] found id: "38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68"
	I1218 00:21:14.280329 1177669 cri.go:89] found id: "1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b"
	I1218 00:21:14.280337 1177669 cri.go:89] found id: "61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a"
	I1218 00:21:14.280343 1177669 cri.go:89] found id: "98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe"
	I1218 00:21:14.280347 1177669 cri.go:89] found id: "891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b"
	I1218 00:21:14.280355 1177669 cri.go:89] found id: "2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	I1218 00:21:14.280359 1177669 cri.go:89] found id: "b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06"
	I1218 00:21:14.280362 1177669 cri.go:89] found id: ""
	I1218 00:21:14.280415 1177669 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:21:14.291297 1177669 command_runner.go:130] ! time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	W1218 00:21:14.291357 1177669 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	I1218 00:21:14.291439 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:21:14.298396 1177669 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:21:14.298416 1177669 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:21:14.298422 1177669 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:21:14.298426 1177669 command_runner.go:130] > member
	I1218 00:21:14.299333 1177669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:21:14.299377 1177669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:21:14.299453 1177669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:21:14.306750 1177669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:21:14.307329 1177669 kubeconfig.go:125] found "functional-240845" server: "https://192.168.49.2:8441"
	I1218 00:21:14.308688 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.308922 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.310273 1177669 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:21:14.310295 1177669 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:21:14.310301 1177669 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:21:14.310306 1177669 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:21:14.310311 1177669 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:21:14.310598 1177669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:21:14.310964 1177669 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:21:14.321005 1177669 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:21:14.321038 1177669 kubeadm.go:602] duration metric: took 21.641512ms to restartPrimaryControlPlane
	I1218 00:21:14.321068 1177669 kubeadm.go:403] duration metric: took 70.601924ms to StartCluster
	I1218 00:21:14.321095 1177669 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.321175 1177669 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.321832 1177669 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.322054 1177669 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:21:14.322232 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:21:14.322270 1177669 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:21:14.322334 1177669 addons.go:70] Setting storage-provisioner=true in profile "functional-240845"
	I1218 00:21:14.322346 1177669 addons.go:239] Setting addon storage-provisioner=true in "functional-240845"
	W1218 00:21:14.322351 1177669 addons.go:248] addon storage-provisioner should already be in state true
	I1218 00:21:14.322373 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.322797 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.323222 1177669 addons.go:70] Setting default-storageclass=true in profile "functional-240845"
	I1218 00:21:14.323243 1177669 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-240845"
	I1218 00:21:14.323528 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.326222 1177669 out.go:179] * Verifying Kubernetes components...
	I1218 00:21:14.329298 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:14.352407 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.352567 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.353875 1177669 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:21:14.354521 1177669 addons.go:239] Setting addon default-storageclass=true in "functional-240845"
	W1218 00:21:14.354541 1177669 addons.go:248] addon default-storageclass should already be in state true
	I1218 00:21:14.354568 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.355010 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.357054 1177669 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.357084 1177669 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:21:14.357149 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.385892 1177669 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.385914 1177669 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:21:14.385974 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.412313 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.438252 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.538332 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:14.555412 1177669 node_ready.go:35] waiting up to 6m0s for node "functional-240845" to be "Ready" ...
	I1218 00:21:14.556627 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.558515 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:14.558665 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:14.559006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:14.569919 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.635955 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.636102 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.636146 1177669 retry.go:31] will retry after 274.076226ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.646979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.650760 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.650797 1177669 retry.go:31] will retry after 360.821893ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.911221 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.974464 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.974555 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.974595 1177669 retry.go:31] will retry after 225.739861ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.012854 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.055958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.056036 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.056342 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.079682 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.079793 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.079817 1177669 retry.go:31] will retry after 552.403697ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.200970 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.261673 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.261728 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.261746 1177669 retry.go:31] will retry after 669.780864ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.556091 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.632797 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.699577 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.699638 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.699664 1177669 retry.go:31] will retry after 634.295794ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.931763 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.990067 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.993514 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.993545 1177669 retry.go:31] will retry after 1.113615509s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.055858 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:16.334650 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:16.392078 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:16.395777 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.395856 1177669 retry.go:31] will retry after 558.474178ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.556248 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.556629 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:16.556701 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:16.955131 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:17.055832 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.055954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.056319 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:17.076617 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.076722 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.076755 1177669 retry.go:31] will retry after 1.676176244s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.108039 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:17.223472 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.223571 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.223606 1177669 retry.go:31] will retry after 1.165701868s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.556175 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.556607 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.056383 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.056458 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.056745 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.390333 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:18.466841 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.466880 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.466899 1177669 retry.go:31] will retry after 1.475434566s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.556290 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.556363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.556640 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.753095 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:18.817795 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.817871 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.817893 1177669 retry.go:31] will retry after 1.833170296s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:19.056294 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.056363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:19.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:19.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.556536 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.556903 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:19.943440 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:20.003817 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.008032 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.008069 1177669 retry.go:31] will retry after 3.979109659s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.056345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.056668 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.556404 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.556792 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.652153 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:20.711890 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.715639 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.715672 1177669 retry.go:31] will retry after 3.637109781s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:21.056958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.057040 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.057388 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:21.057444 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:21.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.555927 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.556005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.556330 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.056025 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.056094 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.056444 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.556246 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.556345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.556676 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:23.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:23.987349 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:24.051441 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.051487 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.051524 1177669 retry.go:31] will retry after 5.3171516s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.056654 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.056732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.057111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:24.353838 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:24.413422 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.413469 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.413487 1177669 retry.go:31] will retry after 3.340127313s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.555854 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.555928 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:26.056042 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.056124 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.056522 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:26.056585 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:26.556332 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.556411 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.056517 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.056589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.056942 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.555642 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.754507 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:27.812979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:27.813026 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:27.813045 1177669 retry.go:31] will retry after 6.95951013s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:28.056456 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.056550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:28.057006 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:28.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.055872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.368874 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:29.425933 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:29.429391 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.429423 1177669 retry.go:31] will retry after 6.711424265s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.555717 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.055742 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.055823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:30.556179 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:31.055879 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.055958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.056290 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:31.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.556084 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.056199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.555959 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.556028 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.556363 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:32.556413 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:33.055772 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.055844 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.056367 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:33.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.556178 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.055882 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.055955 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.556326 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.556397 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:34.556796 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:34.773144 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:34.829958 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:34.833899 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:34.833929 1177669 retry.go:31] will retry after 8.542321591s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:35.056329 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.056407 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.056770 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:35.556516 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.556605 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.556959 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.057369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.057701 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.141963 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:36.202438 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:36.202477 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.202496 1177669 retry.go:31] will retry after 7.758270018s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:37.055818 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.055893 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:37.056274 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:37.555746 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.555822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.055812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.056254 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.556149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:39.055837 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.056297 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:39.056352 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:39.556245 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.556663 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.056509 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.056606 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.056917 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.555706 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.055770 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.056183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.555731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:41.556201 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:42.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.055854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:42.556028 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.556119 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.556476 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.056276 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.056351 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.056698 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.377156 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:43.435792 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:43.439377 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.439408 1177669 retry.go:31] will retry after 18.255208537s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.556665 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.556738 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.557098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:43.557163 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:43.961544 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:44.047619 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:44.047656 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.047681 1177669 retry.go:31] will retry after 16.124184127s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.055890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.056245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:44.556158 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.556259 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.556606 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:45.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.057068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1218 00:21:45.555737 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.555812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.556144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:46.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:46.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:46.555906 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.556364 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.055801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.555784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.056189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:48.056258 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:48.555948 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.556021 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.556370 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.056069 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.056148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.056513 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.556531 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.556886 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:50.056538 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.056619 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.056964 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:50.057018 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:50.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.556116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.055815 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.055884 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.056198 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.555734 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:52.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:53.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.056005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.056346 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:53.556049 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.056030 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.056102 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.056454 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.556467 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.556542 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.556870 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:54.556927 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:55.055618 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.055704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.056046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:55.555616 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.555993 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.055712 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:57.055721 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:57.056210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:57.555892 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.555965 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.556299 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.555793 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.555877 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.556239 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:59.556292 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:00.055898 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.055985 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.056349 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:00.172859 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:00.349113 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:00.349165 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.349188 1177669 retry.go:31] will retry after 15.178958797s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.556482 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.556554 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.056710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.057020 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.555743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.695619 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:01.764253 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:01.768251 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:01.768286 1177669 retry.go:31] will retry after 20.261734519s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:02.055637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.055714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.056058 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:02.056113 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:02.555751 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.555820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.555659 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.555732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.556022 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:04.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.056080 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:04.056144 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:04.556235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.556662 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.056450 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.056522 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.056859 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.556627 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.556731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.557039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:06.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:06.056174 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:06.555712 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.556120 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.056150 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.555911 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.555990 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.556341 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:08.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.055811 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.056155 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:08.056203 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:08.555901 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.555978 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.556327 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:10.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.055797 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:10.056287 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:10.555954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.556049 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.556390 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.555929 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.056135 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:12.556162 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:13.055804 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.056174 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:13.555873 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.555943 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.056098 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.056468 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.556454 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.556529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.556828 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:14.556880 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:15.056592 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.056660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.057019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.528571 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:15.555957 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.556023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.556331 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.591509 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:15.594869 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:15.594900 1177669 retry.go:31] will retry after 30.932709272s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:16.056512 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.056582 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.056902 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:16.555621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.555718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:17.055743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.056124 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:17.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:17.555739 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.555834 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.055987 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.056365 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.556060 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.556132 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.556480 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:19.056235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.056623 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:19.056697 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:19.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.556660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.556999 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.056621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.056700 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.057051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.555764 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.556195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.055711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:21.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:22.030818 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:22.056348 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.056446 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.056751 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:22.091069 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:22.094766 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.094798 1177669 retry.go:31] will retry after 47.715756714s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.556528 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.556883 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.056081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.555699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.555792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:24.055868 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.055942 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.056302 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:24.056357 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:24.556255 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.556349 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.056263 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.056721 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.556539 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.556623 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.557021 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.055767 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.055851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.555952 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.556027 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:26.556419 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:27.056075 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:27.556423 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.556518 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.056638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.057038 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.555732 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.555814 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.556167 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:29.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:29.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:29.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.055846 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.055931 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.056335 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.556057 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.556129 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.556500 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:31.056282 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.056362 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.056704 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:31.056761 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:31.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.556564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.556896 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.055712 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.556248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.055753 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.556054 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.556161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.556684 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:33.556740 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:34.056460 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.056532 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.056854 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:34.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.556213 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.555889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.556242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:36.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.056089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:36.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:36.555705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.055826 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.056241 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.555756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:38.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.056147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:38.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:38.555776 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.556214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.056109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.555711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.555807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.556166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:40.055954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.056034 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.056481 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:40.056546 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:40.556379 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.556469 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.556814 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.056496 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.056800 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.556572 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.556672 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.557008 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.055744 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.056248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.555835 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.555911 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.556276 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:42.556330 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:43.056000 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.056095 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.056464 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:43.556247 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.556319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.056503 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.056852 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.556012 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.556109 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:44.556512 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:45.057726 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.057809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.058234 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:45.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.556053 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.556425 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.055813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.528809 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:46.556363 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.556430 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.556863 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:46.556912 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:46.592076 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592111 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592213 1177669 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:22:47.056004 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.056462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:47.556264 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.556334 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.556652 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.056410 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.056481 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.056790 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.556515 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.556589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.556921 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:48.556975 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:49.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.055730 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:49.555733 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.555821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.556169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.055866 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.055935 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.555707 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.555815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:51.056519 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.056627 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:51.057002 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:51.555636 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.555709 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.556029 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.055820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.555872 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.555945 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.055695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.055766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.056179 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.555862 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.555934 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:53.556377 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:54.056043 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.056137 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.056494 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:54.556337 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.556432 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.556763 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.056566 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.056640 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.056965 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.555663 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.556084 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:56.056189 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:56.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.556131 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.056558 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.056624 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.056923 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.556638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.556705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.556914 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.055638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.055992 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.556608 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.556858 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:58.556906 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:59.055615 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.055693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.055962 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:59.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.055787 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.055870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.056214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.555725 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:01.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:01.056324 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:01.555994 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.556068 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.556424 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.056333 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.556470 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.556547 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.055624 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.055721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.056047 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:03.556183 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:04.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:04.556084 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.556154 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.556510 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.056318 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.056386 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.056739 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.556502 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.556575 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.556888 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:05.556938 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:06.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.056072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:06.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.555824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.556163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.056617 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.056703 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.057017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.555759 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.556245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:08.055620 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.055732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:08.056120 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:08.555828 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.555904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.055992 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.056064 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.056490 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.556279 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.811086 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:23:09.870262 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873844 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873941 1177669 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:23:09.877215 1177669 out.go:179] * Enabled addons: 
	I1218 00:23:09.880843 1177669 addons.go:530] duration metric: took 1m55.558566134s for enable addons: enabled=[]
	I1218 00:23:10.056212 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.056346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.056713 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:10.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:10.556554 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.556967 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.055656 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.055785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.555809 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.555880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.556212 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.055838 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.556050 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:12.556108 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:13.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.055825 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.056171 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:13.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.556080 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.556462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.055821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.056182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.556315 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.556385 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:14.556741 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:15.056401 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.056470 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.056793 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:15.556411 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.556480 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.556780 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.056647 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.056963 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:17.055850 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.055925 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.056282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:17.056334 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:17.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.556122 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.556497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.056286 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.056361 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.056685 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.556408 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.556802 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:19.056570 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.056646 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.057040 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:19.057095 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:19.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.555860 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.555958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.556317 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.056081 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.056416 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.555830 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.555903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:21.556323 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:22.056015 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.056091 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.056432 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:22.556188 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.556285 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.556619 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.056387 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.056459 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.056805 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.556577 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.556991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:23.557043 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:24.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:24.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.556090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.556429 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.056237 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.056319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.056651 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.556407 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.556484 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.556804 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:26.056601 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.056678 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.057039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:26.057096 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:26.556349 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.556417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.556670 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.056529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.555635 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.555714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.556073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.556018 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.556350 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:28.556398 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:29.056066 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.056141 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.056558 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:29.556410 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.556482 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.556819 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.056192 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.056697 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.556347 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.556425 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.556813 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:30.556882 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:31.056651 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.056724 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.057110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:31.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:33.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:33.056171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:33.556520 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.556626 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.557595 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.055711 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.555832 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.555907 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.556266 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:35.055820 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.056262 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:35.056317 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:35.555980 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.556055 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.556475 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.056253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.056329 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.056689 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.556253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.556322 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.556585 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:37.056343 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.056417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.056777 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:37.056832 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:37.556557 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.556628 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.055768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.055671 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.056076 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.555915 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.555994 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:39.556347 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:40.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.056458 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:40.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.556320 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.556648 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.056467 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.056544 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.556710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:41.557023 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:42.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:42.555713 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.055805 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.055880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.056188 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:44.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.055912 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.056273 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:44.056335 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:44.555920 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.555993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.556368 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.058236 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.058319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.058728 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.556602 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.556934 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.055727 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.056067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.556151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:46.556209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:47.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.056162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:47.555674 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.055810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.556441 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.556514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.556832 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:48.556892 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:49.056604 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.056674 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.057002 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:49.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.555771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.055675 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:51.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:51.056177 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:51.555865 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.555937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.556310 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.555777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:53.055816 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.055886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.056231 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:53.056281 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:53.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.556010 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.556362 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.556026 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.556097 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.556417 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.055678 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.056101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.555734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:55.556129 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:56.055730 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:56.555867 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.555946 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.556300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.056090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.056457 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.556250 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.556323 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.556654 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:57.556712 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:58.056487 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:58.555623 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.055906 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.055982 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.056358 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.555710 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.555803 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:00.059640 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.059720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.060067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:00.060115 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:00.556240 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.556671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.056422 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.056490 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.056823 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.556575 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.556648 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.556984 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.056108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.555781 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:02.556145 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:03.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:03.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.555789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.056105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.556112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:04.556502 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:05.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.056331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.056671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:05.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.556557 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.055645 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.055718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.056059 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.555753 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.556081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:07.056275 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.056343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.056625 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:07.056668 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:07.556435 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.556511 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.055588 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.055660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.056003 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.055807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.055881 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.056246 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.555809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:09.556165 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:10.055865 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.055961 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.056303 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:10.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.556089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.055834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.056300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.556519 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:11.556574 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:12.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.056402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.056727 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:12.556537 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.556620 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.556973 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.055664 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.555826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.556183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:14.055916 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.055993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.056372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:14.056425 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:14.556306 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.556384 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.556722 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.056477 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.056546 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.056868 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.556714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.557060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.556138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:16.556191 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:17.055685 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.056060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:17.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.056016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.555699 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:19.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:19.056164 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:19.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.055623 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.055701 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.056014 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.555727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.055681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:21.556151 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:22.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.056666 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:22.556365 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.556434 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.556767 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.056542 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.056618 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.056908 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.555630 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.556032 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:24.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:24.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:24.556065 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.556148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.556518 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.056084 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.056512 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.556289 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.556691 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:26.056500 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.056600 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.056966 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:26.057019 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:26.555679 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.556107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.055812 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.556038 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.556103 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:28.556166 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:29.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.555778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.555730 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:30.556245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:31.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:31.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.055797 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.555678 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:33.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:33.056107 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:33.555768 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.556187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.055758 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.055833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.556178 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:35.056326 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.056396 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.056747 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:35.056801 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:35.556586 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.556657 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.557044 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.055782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.555908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.556282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:37.056627 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.057073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:37.057140 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:37.555781 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.555850 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.055869 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.055947 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.056284 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.055844 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.055924 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.056291 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:39.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:40.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:40.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.056055 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.555718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.556072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:42.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:42.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:42.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.556016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.556296 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.556369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:44.556718 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:45.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.057363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.057788 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:45.556579 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.556652 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.556974 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.555790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:47.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.055779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.056086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:47.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:47.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.555807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.555758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.556101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:49.556155 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:50.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.056148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:50.555821 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.555892 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.556250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.056032 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.056394 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:51.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:52.055852 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.056308 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:52.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.556071 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.556442 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.056207 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.056295 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.056620 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.556372 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.556448 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.556772 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:53.556829 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:54.056587 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.056664 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.057045 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:54.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.556382 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.056125 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:56.055825 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.055904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.056298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:56.056356 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:56.556003 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.556076 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.056092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.555906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.556260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:58.556314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:59.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.055748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:59.555874 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.555954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.055976 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.056062 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.056441 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.556138 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.556250 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.556688 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:00.556759 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:01.056530 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.056604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.056936 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:01.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.555731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.056275 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.555950 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.556022 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.556393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:03.056088 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.056161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.056527 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:03.056581 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:03.556314 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.556377 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.556644 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.056325 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.056398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.056741 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.556656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.556735 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.557093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.055643 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.556394 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.556464 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.556836 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:05.556889 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:06.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.057085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:06.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.056100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.556130 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:08.055819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.056255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:08.056314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:08.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.556052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.556389 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.556157 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.556549 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:10.056352 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.056426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.056786 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:10.056843 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:10.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.556658 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.557006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.055714 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.555890 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.555963 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.556324 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.056036 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.056112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.056497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.556273 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.556347 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:12.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:13.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.056492 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.056825 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:13.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.556340 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.556615 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.055978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.056067 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.555796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.556186 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:15.055759 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.055835 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.056169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:15.056244 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:15.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.556434 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.056196 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.056293 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.056642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.556550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.556891 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.055661 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.056001 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.556096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:17.556152 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:18.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:18.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.055840 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.055916 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.555697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:19.556192 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:20.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:20.555796 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.555870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.556255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.055952 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.056023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.056395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.556064 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.556138 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.556504 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:21.556557 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:22.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.056350 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.056700 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:22.556495 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.556573 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.556915 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.055663 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.055991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.555759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:24.055734 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.055816 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.056168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:24.056239 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:24.556073 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.556516 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:26.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.055868 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.056210 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:26.056286 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:26.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.556109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.055906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.056250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.555667 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.556156 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:28.556210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:29.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:29.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.056614 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.555633 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.555715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:31.055809 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.056307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:31.056368 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:31.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.055756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.555778 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.555851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:33.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.056351 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:33.056404 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:33.556040 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.556451 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.056004 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.056660 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.556087 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.555845 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.556288 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:35.556349 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:36.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:36.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.555888 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:38.055607 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.055689 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.056039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:38.056098 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:38.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.055853 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.056286 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.556210 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.556639 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:40.056445 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.056865 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:40.056930 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:40.556620 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.556695 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.557017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.555789 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.555863 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.556189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.056163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.555934 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.556328 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:42.556374 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:43.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:43.555816 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.556292 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.055998 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.056073 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.556595 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.556924 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:44.556979 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:45.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.056201 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:45.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.056492 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.056876 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.556642 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.556716 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.557036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:46.557089 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:47.055763 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.055839 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:47.555908 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.555986 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.556307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.055720 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.556093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:49.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.056114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:49.056169 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:49.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.055833 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.055910 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.056293 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.555974 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.556043 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:51.056056 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.056128 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.056465 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:51.056513 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:51.556270 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.556344 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.556681 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.056466 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.056539 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.056895 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.555612 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.555693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.556206 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.055909 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.055981 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:53.556175 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:54.055792 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.056260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:54.556080 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.556156 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.556472 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.555651 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.556079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:56.056196 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:56.555837 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.555913 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.056033 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.056356 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.056107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.555780 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.555854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.556190 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:58.556260 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:59.055926 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.056011 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:59.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.556343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.556642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.059082 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.059161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.059514 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.556566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.556913 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:00.556965 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:01.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.055719 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:01.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.555754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:03.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.056098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:03.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:03.555672 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.056115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.556167 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.556265 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.556617 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:05.056442 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.056514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:05.056907 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:05.556597 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.556667 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.556997 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.556092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.055659 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.055728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.056019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:07.556123 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:08.055665 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.055743 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.056061 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:08.555631 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.555705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.055707 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.055787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.555670 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.556065 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:10.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.055794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:10.056268 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:10.555749 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.555819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.556146 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.055855 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:12.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:13.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:13.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.555886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.556252 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.055957 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.056026 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.056385 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.556596 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.556938 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:14.556992 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:15.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:15.555805 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.555887 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.055807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.555813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:17.055842 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.055915 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.056281 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:17.056342 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:17.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.555824 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.555898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.556258 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.055724 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.555983 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.556056 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.556402 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:19.556458 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:20.056180 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.056281 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.056653 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:20.556480 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.556560 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.056634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.056717 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.057043 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:22.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.055734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.056082 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:22.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:22.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.556259 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.055950 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.056030 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.056393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.555649 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.556074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.055763 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.556096 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.556167 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.556536 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:24.556589 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:25.056122 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.056197 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.056567 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:25.556331 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.556402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.556737 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.056615 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.056954 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.555650 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:27.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:27.056160 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:27.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.555829 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.055860 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.055937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.556069 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.556395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:29.056209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:29.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.055732 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.055808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.056154 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.555757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:31.055778 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.055856 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:31.056278 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:31.555669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.555744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.555701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:33.055858 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.056301 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:33.056353 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:33.556038 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.556445 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.056214 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.056311 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.056650 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.556604 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.556677 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.557012 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:35.056645 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.056718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.057052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:35.057102 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:35.555605 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.555680 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.556018 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.055752 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.055826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.056172 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.556126 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.055668 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.056096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.555797 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.555867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.556203 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:37.556272 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:38.055962 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.056083 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.056495 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:38.556271 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.556346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.556695 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.055905 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.055976 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.056296 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.556206 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:39.556792 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:40.056703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.056787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.057218 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:40.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.555750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:42.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.055860 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:42.056301 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:42.555953 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.556025 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.556420 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.555853 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.556039 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.556118 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:44.556541 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:45.055766 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.055855 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:45.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.555793 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:47.055949 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.056052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.056438 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:47.056488 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:47.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.556324 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.556696 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.056448 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.056853 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.556536 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.556604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.556871 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:49.056619 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.056692 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.057011 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:49.057072 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:49.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.556115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.055723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.555726 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.555805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:51.556259 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:52.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.056215 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:52.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.056106 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.555801 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.556202 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:53.556285 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:54.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.056407 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:54.556321 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.556398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.557023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.055680 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.555795 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.555866 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.556181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:56.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.055805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.056166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:56.056245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:56.555703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.555700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.556119 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:58.556172 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:59.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.055903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:59.555918 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.555995 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.056095 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.056184 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.056542 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.556340 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.556426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.556768 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:00.556820 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:01.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.056617 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.056937 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:01.555665 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.056048 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.056120 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.056471 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.556307 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.556375 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:03.056489 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.056566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.056907 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:03.056959 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:03.556548 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.556616 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.556947 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.055614 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.055691 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.056023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.556067 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.556168 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.055755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.056077 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.556113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:05.556171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:06.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.056074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:06.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.555767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.055795 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.555673 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.556083 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:08.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.055846 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.056205 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:08.056297 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:08.555965 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.556379 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.055815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.056111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.556064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:10.556121 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:11.055789 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:11.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.055985 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.555638 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.555713 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.556042 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:13.055626 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.055698 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.056031 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:13.056081 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:13.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.556147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.055908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:14.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.555886 1177669 node_ready.go:38] duration metric: took 6m0.000394955s for node "functional-240845" to be "Ready" ...
	I1218 00:27:14.559015 1177669 out.go:203] 
	W1218 00:27:14.562031 1177669 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:27:14.562056 1177669 out.go:285] * 
	W1218 00:27:14.564187 1177669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:27:14.567133 1177669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:26:34 functional-240845 crio[2318]: time="2025-12-18T00:26:34.986396852Z" level=info msg="Started container" PID=3029 containerID=3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0 description=kube-system/storage-provisioner/storage-provisioner id=92da38d7-d7a2-498b-be1f-6a608d2863bd name=/runtime.v1.RuntimeService/StartContainer sandboxID=552e688f4b2fbc14144d1338b2c4dcb19f6ce8e6a4c97e972802d45e8f302aae
	Dec 18 00:26:35 functional-240845 conmon[3027]: conmon 3051bfe26a7bd174b56e <ninfo>: container 3029 exited with status 1
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.375814642Z" level=info msg="Removing container: d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.383466329Z" level=info msg="Error loading conmon cgroup of container d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b: cgroup deleted" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.386625146Z" level=info msg="Removed container d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b: kube-system/storage-provisioner/storage-provisioner" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.961463725Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=240ed625-f345-4390-995a-2a00ffe5d533 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.962777924Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=83b0019c-125f-488a-b206-d72df501b471 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.963805181Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=00f1b8b0-efe0-4fa9-a417-a72092c89009 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.964001589Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.968079081Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=00f1b8b0-efe0-4fa9-a417-a72092c89009 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.960919479Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=8553d3bc-767f-41e0-83b6-410ac05d4c37 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.961885044Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=81af4700-b21e-478b-99ab-34aa44b35818 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.962802996Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=8bcfd0f5-e059-469e-b064-091f9cedf10c name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.962900684Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.967174173Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=8bcfd0f5-e059-469e-b064-091f9cedf10c name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.963021789Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=2b4ec4ab-8335-4e22-a3cd-c579780fa2ca name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.964640183Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=87bae2c1-8fd4-4ec1-9196-cae09ea601e4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.965602499Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=8678d9a2-1c51-4d52-b1a2-f3774e105249 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.965701959Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.96965778Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=8678d9a2-1c51-4d52-b1a2-f3774e105249 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.964208217Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=aa0e3d92-0554-4021-a6d9-7af588b78771 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.966641043Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=c61b476e-8cdc-4380-b5f7-22466ba406b3 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.9686077Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=04ca7006-7f30-4412-ad63-4fd0c4347029 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.96872005Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.972930444Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=04ca7006-7f30-4412-ad63-4fd0c4347029 name=/runtime.v1.RuntimeService/CreateContainer
	
	
	==> container status <==
	CONTAINER           IMAGE                                                              CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	3051bfe26a7bd       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6   41 seconds ago      Exited              storage-provisioner       6                   552e688f4b2fb       storage-provisioner                         kube-system
	56af7390805be       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22   2 minutes ago       Exited              kube-controller-manager   5                   d9cddccbc36e9       kube-controller-manager-functional-240845   kube-system
	3df4b23cd1fc9       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   5 minutes ago       Running             kube-proxy                2                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	9b3fcd7bdcddc       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   5 minutes ago       Running             kindnet-cni               2                   2557f167a47ed       kindnet-84qbm                               kube-system
	fb962917a931f       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   5 minutes ago       Running             etcd                      2                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	f6d062f0f43f4       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   5 minutes ago       Running             kube-scheduler            2                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	45ca9ca01a676       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   6 minutes ago       Running             coredns                   2                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	e79c8e6ec8375       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   7 minutes ago       Exited              kube-proxy                1                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	0fe4c80fa2adf       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   7 minutes ago       Exited              kindnet-cni               1                   2557f167a47ed       kindnet-84qbm                               kube-system
	95d915f37e740       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   7 minutes ago       Exited              etcd                      1                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	9caeb1dccc679       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   7 minutes ago       Exited              kube-scheduler            1                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	cf507cc725a8d       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   7 minutes ago       Exited              coredns                   1                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	2b9f193a1520d       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896   8 minutes ago       Exited              kube-apiserver            0                   e04fd252da213       kube-apiserver-functional-240845            kube-system
	
	
	==> coredns [45ca9ca01a676570a0535560af08d4e95f72145d9702ec8b798ce70d833c0356] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	
	
	==> coredns [cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] 127.0.0.1:42709 - 40478 "HINFO IN 7841480554586397634.8984575394038029725. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043241905s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e] <==
	{"level":"info","ts":"2025-12-18T00:19:42.456949Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.459892Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.464260Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:19:42.464361Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:19:42.473581Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:19:42.473686Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.512923Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.860469Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T00:19:42.860559Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-12-18T00:19:42.860705Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.862987Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.863085Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.863106Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"info","ts":"2025-12-18T00:19:42.863197Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-12-18T00:19:42.863210Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863328Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863350Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863361Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863401Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863409Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863417Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.876941Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-12-18T00:19:42.877021Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.877059Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:19:42.877083Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [fb962917a931fb777a305b1b6998e379972e4d38499641f5d582e94ff93708b1] <==
	{"level":"info","ts":"2025-12-18T00:21:16.766111Z","caller":"fileutil/purge.go:49","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2025-12-18T00:21:16.766332Z","caller":"embed/etcd.go:640","msg":"serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.766371Z","caller":"embed/etcd.go:611","msg":"cmux::serve","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.767189Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1981","msg":"aec36adc501070cc switched to configuration voters=(12593026477526642892)"}
	{"level":"info","ts":"2025-12-18T00:21:16.767293Z","caller":"membership/cluster.go:433","msg":"ignore already added member","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","added-peer-id":"aec36adc501070cc","added-peer-peer-urls":["https://192.168.49.2:2380"],"added-peer-is-learner":false}
	{"level":"info","ts":"2025-12-18T00:21:16.767389Z","caller":"membership/cluster.go:674","msg":"updated cluster version","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","from":"3.6","to":"3.6"}
	{"level":"info","ts":"2025-12-18T00:21:17.452278Z","logger":"raft","caller":"v3@v3.6.0/raft.go:988","msg":"aec36adc501070cc is starting a new election at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452423Z","logger":"raft","caller":"v3@v3.6.0/raft.go:930","msg":"aec36adc501070cc became pre-candidate at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452498Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgPreVoteResp from aec36adc501070cc at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452538Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.452580Z","logger":"raft","caller":"v3@v3.6.0/raft.go:912","msg":"aec36adc501070cc became candidate at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458217Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgVoteResp from aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458311Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.458356Z","logger":"raft","caller":"v3@v3.6.0/raft.go:970","msg":"aec36adc501070cc became leader at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458401Z","logger":"raft","caller":"v3@v3.6.0/node.go:370","msg":"raft.node: aec36adc501070cc elected leader aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.464392Z","caller":"etcdserver/server.go:1820","msg":"published local member to cluster through raft","local-member-id":"aec36adc501070cc","local-member-attributes":"{Name:functional-240845 ClientURLs:[https://192.168.49.2:2379]}","cluster-id":"fa54960ea34d58be","publish-timeout":"7s"}
	{"level":"info","ts":"2025-12-18T00:21:17.464576Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.465463Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.467639Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:21:17.469663Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.484256Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:21:17.484501Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:21:17.485336Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:21:17.490008Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.492585Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	
	
	==> kernel <==
	 00:27:16 up  7:09,  0 user,  load average: 0.01, 0.39, 1.11
	Linux functional-240845 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c] <==
	I1218 00:19:41.706085       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 00:19:41.706608       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1218 00:19:41.706796       1 main.go:148] setting mtu 1500 for CNI 
	I1218 00:19:41.706849       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 00:19:41.706886       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T00:19:41Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	E1218 00:19:41.884497       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	I1218 00:19:41.884891       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 00:19:41.884912       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 00:19:41.884921       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 00:19:41.885210       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1218 00:19:41.885327       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:19:41.885420       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:19:41.885714       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:19:42.810791       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kindnet [9b3fcd7bdcddc7326e7d4c50ecf0ebeef85e8ebe52719009cafb599db42b74a4] <==
	E1218 00:22:39.976452       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:02.096720       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:15.308322       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:23:24.321432       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:23:30.188055       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:40.934269       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:50.555354       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:11.192172       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:24:18.295989       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:20.463220       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:24:45.334437       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:50.163434       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:56.363905       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:14.032706       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:25:16.121579       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:25:27.942524       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:48.335255       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:25:56.102936       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:05.238670       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:26:10.015965       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:26:40.630863       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:43.534436       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:26:50.773423       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:04.308560       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:27:16.542428       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	
	
	==> kube-apiserver [2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de] <==
	W1218 00:19:35.450481       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450516       1 logging.go:55] [core] [Channel #2 SubChannel #6]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450551       1 logging.go:55] [core] [Channel #7 SubChannel #9]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450585       1 logging.go:55] [core] [Channel #26 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450620       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451088       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451140       1 logging.go:55] [core] [Channel #255 SubChannel #257]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451176       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451218       1 logging.go:55] [core] [Channel #187 SubChannel #189]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	I1218 00:19:35.461000       1 controller.go:128] Shutting down kubernetes service endpoint reconciler
	W1218 00:19:35.463755       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463898       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463993       1 logging.go:55] [core] [Channel #235 SubChannel #237]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464077       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464158       1 logging.go:55] [core] [Channel #199 SubChannel #201]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464191       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464387       1 logging.go:55] [core] [Channel #91 SubChannel #93]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464473       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464539       1 logging.go:55] [core] [Channel #223 SubChannel #225]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464601       1 logging.go:55] [core] [Channel #107 SubChannel #109]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464323       1 logging.go:55] [core] [Channel #155 SubChannel #157]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464353       1 logging.go:55] [core] [Channel #99 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464694       1 logging.go:55] [core] [Channel #227 SubChannel #229]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464564       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90] <==
	I1218 00:24:41.593979       1 serving.go:386] Generated self-signed cert in-memory
	I1218 00:24:44.118744       1 controllermanager.go:191] "Starting" version="v1.34.3"
	I1218 00:24:44.118773       1 controllermanager.go:193] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:24:44.120180       1 dynamic_cafile_content.go:161] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1218 00:24:44.120356       1 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1218 00:24:44.120599       1 secure_serving.go:211] Serving securely on 127.0.0.1:10257
	I1218 00:24:44.120986       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E1218 00:24:54.122993       1 controllermanager.go:245] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.49.2:8441/healthz\": dial tcp 192.168.49.2:8441: connect: connection refused"
	
	
	==> kube-proxy [3df4b23cd1fc91cb6876fab74b357bb139f1ea48223b502c7dd9c80ea84c8387] <==
	I1218 00:21:20.402926       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:21:20.489835       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1218 00:21:20.490690       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:21.505281       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:24.355725       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:29.225897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:37.736765       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:52.064415       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:22:27.480420       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:23:19.153551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:24:11.289733       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:05.391565       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:37.545166       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:16.451554       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:57.580830       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	
	
	==> kube-proxy [e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563] <==
	
	
	==> kube-scheduler [9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1] <==
	I1218 00:19:42.975868       1 serving.go:386] Generated self-signed cert in-memory
	W1218 00:19:43.469835       1 authentication.go:397] Error looking up in-cluster authentication configuration: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:19:43.469867       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 00:19:43.469874       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 00:19:43.478778       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 00:19:43.478807       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	E1218 00:19:43.478827       1 event.go:401] "Unable start event watcher (will not retry!)" err="broadcaster already stopped"
	I1218 00:19:43.480968       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481035       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481365       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	E1218 00:19:43.481433       1 server.go:286] "handlers are not fully synchronized" err="context canceled"
	E1218 00:19:43.481498       1 shared_informer.go:352] "Unable to sync caches" logger="UnhandledError" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481515       1 configmap_cafile_content.go:213] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481533       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:19:43.481546       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 00:19:43.481689       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 00:19:43.481706       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 00:19:43.481710       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 00:19:43.481721       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [f6d062f0f43f4922799fb3880d16e341783d4d7d586d7db4a50fb1085ef76e6e] <==
	E1218 00:26:18.206826       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:26:18.531686       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:26:19.142467       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:26:19.248580       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:26:22.795585       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:23.239244       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:26:27.453246       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:28.064817       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:26:28.967293       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:26:33.949400       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:26:35.154591       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:26:40.179897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:26:42.511252       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:26:44.107019       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:26:46.364045       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:26:51.526793       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:26:53.205541       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:53.282506       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:26:56.102986       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: Get \"https://192.168.49.2:8441/api/v1/replicationcontrollers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:26:58.988588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:27:00.599087       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:27:06.721194       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:27:09.382892       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:27:10.196478       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:27:11.331530       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	
	
	==> kubelet <==
	Dec 18 00:27:03 functional-240845 kubelet[1315]: E1218 00:27:03.594006    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:07 functional-240845 kubelet[1315]: E1218 00:27:07.257742    1315 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/events/kube-scheduler-functional-240845.18822743ba5c43bb\": dial tcp 192.168.49.2:8441: connect: connection refused" event="&Event{ObjectMeta:{kube-scheduler-functional-240845.18822743ba5c43bb  kube-system    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-scheduler-functional-240845,UID:8e5e0ee0f3cd0bbcd38493dce832a8ff,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://127.0.0.1:10259/readyz\": dial tcp 127.0.0.1:10259: connect: connection refused,Source:EventSource{Component:kubelet,Host:functional-240845,},FirstTimestamp:2025-12-18 00:19:35.725556667 +0000 UTC m=+22.878905798,LastTimestamp:2025-12-18 00:19:36.726077626 +0000 UTC m=+23.879426766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:functional-240845,}"
	Dec 18 00:27:10 functional-240845 kubelet[1315]: E1218 00:27:10.595150    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:11 functional-240845 kubelet[1315]: I1218 00:27:11.960954    1315 scope.go:117] "RemoveContainer" containerID="3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0"
	Dec 18 00:27:11 functional-240845 kubelet[1315]: E1218 00:27:11.961545    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(36dc300a-a099-40d7-874e-e5c2b3795445)\"" pod="kube-system/storage-provisioner" podUID="36dc300a-a099-40d7-874e-e5c2b3795445"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.055596    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a\\\",\\\"docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1\\\",\\\"docker.io/kindest/kindnetd:v20250512-df8de77b\\\"],\\\"sizeBytes\\\":111333938},{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae\\\",\\\"docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3\\\",\\\"docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88\\\"],\\\"sizeBytes\\\":108362109},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\\\",\\\"registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5\\\",\\\"registry.k8s.io/kube-apiserver:v1.34.3\\\"],\\\"sizeBytes\\\":84818927},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86\\\",\\\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\\\",\\\"registry.k8s.io/kube-proxy:v1.34.3\\\"],\\\"sizeBytes\\\":75941783},{\\\"names\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789\\\",\\\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\\\",\\\"registry.k8s.io/coredns/coredns:v1.12.1\\\"],\\\"sizeBytes\\\":73195387},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1\\\",\\\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\\\",\\\"registry.k8s.io/kube-controller-manager:v1.34.3\\\"],\\\"sizeBytes\\\":72629077},{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\\\",\\\"registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e\\\",\\\"registry.k8s.io/etcd:3.6.5-0\\\"],\\\"sizeBytes\\\":60857170},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611\\\",\\\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\\\",\\\"registry.k8s.io/kube-scheduler:v1.34.3\\\"],\\\"sizeBytes\\\":51592021},{\\\"names\\\":[\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2\\\",\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944\\\",\\\"gcr.io/k8s-minikube/storage-provisioner:v5\\\"],\\\"sizeBytes\\\":29037500},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\\\",\\\"registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f\\\",\\\"registry.k8s.io/pause:3.10.1\\\"],\\\"sizeBytes\\\":519884}]}}\" for node \"functional-240845\": Patch \"https://192.168.49.2:8441/api/v1/nodes/functional-240845/status?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056192    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056534    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056816    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.057023    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.057048    1315 kubelet_node_status.go:473] "Unable to update node status" err="update node status exceeds retry count"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: I1218 00:27:12.961104    1315 scope.go:117] "RemoveContainer" containerID="2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.961521    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kindnet-84qbm\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="046ced09-dec4-43cb-848e-b84560229897" pod="kube-system/kindnet-84qbm"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.966149    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/storage-provisioner\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="36dc300a-a099-40d7-874e-e5c2b3795445" pod="kube-system/storage-provisioner"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.973314    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/etcd-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="9257aaeefd3fa4168607b7fbbc0bc32d" pod="kube-system/etcd-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.973863    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-apiserver-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="deb3e5bf338d69244d476364f7618b54" pod="kube-system/kube-apiserver-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974191    1315 log.go:32] "CreateContainer in sandbox from runtime service failed" err="rpc error: code = Unknown desc = the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" podSandboxID="e04fd252da21318ab96dfa8b10e5404c17e6ae263ccbb9e9f922d43a78607f1a"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974364    1315 kuberuntime_manager.go:1449] "Unhandled Error" err="container kube-apiserver start failed in pod kube-apiserver-functional-240845_kube-system(deb3e5bf338d69244d476364f7618b54): CreateContainerError: the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" logger="UnhandledError"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974485    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CreateContainerError: \"the container name \\\"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\\\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use\"" pod="kube-system/kube-apiserver-functional-240845" podUID="deb3e5bf338d69244d476364f7618b54"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.975690    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-controller-manager-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="6aa5c667ab761331e5a16029bab33485" pod="kube-system/kube-controller-manager-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976005    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-scheduler-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="8e5e0ee0f3cd0bbcd38493dce832a8ff" pod="kube-system/kube-scheduler-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976667    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-proxy-kr6r5\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="86ad3ff0-4da0-4019-8dc4-c0b794c26b01" pod="kube-system/kube-proxy-kr6r5"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976971    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-mrclk\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="39971787-690f-4cc8-814a-be70de00c6a9" pod="kube-system/coredns-66bc5c9577-mrclk"
	Dec 18 00:27:15 functional-240845 kubelet[1315]: I1218 00:27:15.961084    1315 scope.go:117] "RemoveContainer" containerID="56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90"
	Dec 18 00:27:15 functional-240845 kubelet[1315]: E1218 00:27:15.961278    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=kube-controller-manager-functional-240845_kube-system(6aa5c667ab761331e5a16029bab33485)\"" pod="kube-system/kube-controller-manager-functional-240845" podUID="6aa5c667ab761331e5a16029bab33485"
	
	
	==> storage-provisioner [3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0] <==
	I1218 00:26:34.997517       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1218 00:26:34.998962       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845: exit status 2 (376.723835ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-240845" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctional/serial/SoftStart (464.23s)
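Every component in the SoftStart failure above reports the same symptom: `dial tcp 192.168.49.2:8441: connect: connection refused`, meaning the node is reachable but nothing is listening on the apiserver port. As a minimal triage sketch (the host and port are taken from the log; `probe_apiserver` is a hypothetical helper, not part of minikube or the test suite), a plain TCP connect is enough to distinguish a dead listener from a network or routing problem:

```python
import socket

def probe_apiserver(host: str, port: int, timeout: float = 2.0) -> str:
    """Classify a TCP connect attempt, mirroring the
    'dial tcp ... connect: connection refused' errors in the log."""
    try:
        # A successful connect means some process is listening on the port.
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except ConnectionRefusedError:
        # The host answered with a RST: reachable, but no listener --
        # the kube-apiserver process itself is down (the pattern here).
        return "refused"
    except OSError:
        # Timeout or routing failure: the SYN never reached a host at all.
        return "unreachable"
```

A result of `"refused"` for `probe_apiserver("192.168.49.2", 8441)` would match this report: the container is running (docker inspect shows `"Status": "running"`) while kube-apiserver is not serving, consistent with the kubelet's `CreateContainerError` loop above.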

                                                
                                    
TestFunctional/serial/KubectlGetPods (3s)

                                                
                                                
=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-240845 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-240845 get po -A: exit status 1 (67.342459ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-240845 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-240845 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-240845 get po -A"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctional/serial/KubectlGetPods]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctional/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-240845
helpers_test.go:244: (dbg) docker inspect functional-240845:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	        "Created": "2025-12-18T00:18:49.336039923Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1175534,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:18:49.397861382Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hostname",
	        "HostsPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hosts",
	        "LogPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2-json.log",
	        "Name": "/functional-240845",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-240845:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-240845",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	                "LowerDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/merged",
	                "UpperDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/diff",
	                "WorkDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-240845",
	                "Source": "/var/lib/docker/volumes/functional-240845/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-240845",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-240845",
	                "name.minikube.sigs.k8s.io": "functional-240845",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "80ff640c2f3e079a9c83df8e9e88ea18985e04567ee70a1bf3deb87b69d7a9ef",
	            "SandboxKey": "/var/run/docker/netns/80ff640c2f3e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33920"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33921"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33924"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33922"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33923"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-240845": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "f2:33:56:5f:da:77",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3f9ded1bec62ca4e0acc6643285f4a8aef2088de15bf9d1e6dbf478246c82ae7",
	                    "EndpointID": "a267c79a59d712dbf268b4db11b833499096e030f2777b578bf84c7f9519c961",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-240845",
	                        "5d3e3e2a238b"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845: exit status 2 (321.956265ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctional/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctional/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs -n 25: (1.615736978s)
helpers_test.go:261: TestFunctional/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                          ARGS                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons  │ addons-399099 addons disable cloud-spanner --alsologtostderr -v=1                                                       │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:16 UTC │                     │
	│ ip      │ addons-399099 ip                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ addons-399099 addons disable ingress-dns --alsologtostderr -v=1                                                         │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │                     │
	│ addons  │ addons-399099 addons disable ingress --alsologtostderr -v=1                                                             │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │                     │
	│ stop    │ -p addons-399099                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ enable dashboard -p addons-399099                                                                                       │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ disable dashboard -p addons-399099                                                                                      │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ addons  │ disable gvisor -p addons-399099                                                                                         │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ delete  │ -p addons-399099                                                                                                        │ addons-399099     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:17 UTC │
	│ start   │ -p nospam-499800 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-499800 --driver=docker  --container-runtime=crio │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:17 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ start   │ nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                                      │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                                         │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ delete  │ -p nospam-499800                                                                                                        │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:19 UTC │
	│ start   │ -p functional-240845 --alsologtostderr -v=8                                                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:19 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:19:34
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:19:34.105346 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105377 1177669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:19:34.105397 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105673 1177669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:19:34.106120 1177669 out.go:368] Setting JSON to false
	I1218 00:19:34.107069 1177669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25322,"bootTime":1765991852,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:19:34.107165 1177669 start.go:143] virtualization:  
	I1218 00:19:34.110567 1177669 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:19:34.114275 1177669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:19:34.114378 1177669 notify.go:221] Checking for updates...
	I1218 00:19:34.120029 1177669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:19:34.122925 1177669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:19:34.125751 1177669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:19:34.128638 1177669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:19:34.131461 1177669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:19:34.134887 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:34.134985 1177669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:19:34.159427 1177669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:19:34.159542 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.223972 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.214884618 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.224090 1177669 docker.go:319] overlay module found
	I1218 00:19:34.227220 1177669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:19:34.229963 1177669 start.go:309] selected driver: docker
	I1218 00:19:34.229985 1177669 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.230103 1177669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:19:34.230199 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.285040 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.2764408 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.285449 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:19:34.285507 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:19:34.285561 1177669 start.go:353] cluster config:
	{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.290381 1177669 out.go:179] * Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	I1218 00:19:34.293210 1177669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:19:34.297960 1177669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:19:34.300783 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:19:34.300829 1177669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:19:34.300855 1177669 cache.go:65] Caching tarball of preloaded images
	I1218 00:19:34.300881 1177669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:19:34.300940 1177669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:19:34.300950 1177669 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:19:34.301056 1177669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/config.json ...
	I1218 00:19:34.320164 1177669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:19:34.320186 1177669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:19:34.320203 1177669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:19:34.320279 1177669 start.go:360] acquireMachinesLock for functional-240845: {Name:mk3ed718f4cde9dd7b19ef8d5bcd86c3175b5067 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:19:34.320350 1177669 start.go:364] duration metric: took 45.89µs to acquireMachinesLock for "functional-240845"
	I1218 00:19:34.320375 1177669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:19:34.320383 1177669 fix.go:54] fixHost starting: 
	I1218 00:19:34.320643 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:19:34.337200 1177669 fix.go:112] recreateIfNeeded on functional-240845: state=Running err=<nil>
	W1218 00:19:34.337231 1177669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:19:34.340534 1177669 out.go:252] * Updating the running docker "functional-240845" container ...
	I1218 00:19:34.340583 1177669 machine.go:94] provisionDockerMachine start ...
	I1218 00:19:34.340661 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.357593 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.357953 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.357966 1177669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:19:34.511862 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.511889 1177669 ubuntu.go:182] provisioning hostname "functional-240845"
	I1218 00:19:34.511951 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.530122 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.530421 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.530437 1177669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-240845 && echo "functional-240845" | sudo tee /etc/hostname
	I1218 00:19:34.693713 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.693796 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.711115 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.711437 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.711457 1177669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-240845' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-240845/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-240845' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:19:34.868676 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:19:34.868704 1177669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:19:34.868727 1177669 ubuntu.go:190] setting up certificates
	I1218 00:19:34.868737 1177669 provision.go:84] configureAuth start
	I1218 00:19:34.868796 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:34.885386 1177669 provision.go:143] copyHostCerts
	I1218 00:19:34.885436 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885473 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:19:34.885484 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885557 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:19:34.885647 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885670 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:19:34.885675 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885701 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:19:34.885784 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885802 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:19:34.885807 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885830 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:19:34.885882 1177669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-240845 san=[127.0.0.1 192.168.49.2 functional-240845 localhost minikube]
	I1218 00:19:35.070465 1177669 provision.go:177] copyRemoteCerts
	I1218 00:19:35.070558 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:19:35.070625 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.089175 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:35.196164 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:19:35.196247 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:19:35.213266 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:19:35.213323 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:19:35.231357 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:19:35.231416 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:19:35.249293 1177669 provision.go:87] duration metric: took 380.542312ms to configureAuth
	I1218 00:19:35.249372 1177669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:19:35.249565 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:35.249673 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.267176 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:35.267503 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:35.267526 1177669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:19:40.661888 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:19:40.661918 1177669 machine.go:97] duration metric: took 6.321326566s to provisionDockerMachine
	I1218 00:19:40.661929 1177669 start.go:293] postStartSetup for "functional-240845" (driver="docker")
	I1218 00:19:40.661947 1177669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:19:40.662006 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:19:40.662069 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.679665 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.787680 1177669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:19:40.790725 1177669 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:19:40.790745 1177669 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:19:40.790750 1177669 command_runner.go:130] > VERSION_ID="12"
	I1218 00:19:40.790757 1177669 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:19:40.790762 1177669 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:19:40.790766 1177669 command_runner.go:130] > ID=debian
	I1218 00:19:40.790771 1177669 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:19:40.790776 1177669 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:19:40.790785 1177669 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:19:40.790821 1177669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:19:40.790843 1177669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:19:40.790853 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:19:40.790906 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:19:40.790988 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:19:40.791003 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:19:40.791081 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:19:40.791089 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:19:40.791141 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:19:40.798177 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:19:40.814786 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:19:40.830892 1177669 start.go:296] duration metric: took 168.948549ms for postStartSetup
	I1218 00:19:40.831030 1177669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:19:40.831082 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.848091 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.952833 1177669 command_runner.go:130] > 13%
	I1218 00:19:40.953354 1177669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:19:40.957853 1177669 command_runner.go:130] > 171G
	I1218 00:19:40.958309 1177669 fix.go:56] duration metric: took 6.637921757s for fixHost
	I1218 00:19:40.958329 1177669 start.go:83] releasing machines lock for "functional-240845", held for 6.637966499s
	I1218 00:19:40.958394 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:40.975843 1177669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:19:40.975911 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.976173 1177669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:19:40.976254 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.995610 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.013560 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.099878 1177669 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:19:41.100025 1177669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:19:41.195326 1177669 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:19:41.198525 1177669 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:19:41.198598 1177669 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:19:41.198697 1177669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:19:41.321255 1177669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:19:41.326138 1177669 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:19:41.326216 1177669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:19:41.326312 1177669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:19:41.337406 1177669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:19:41.337470 1177669 start.go:496] detecting cgroup driver to use...
	I1218 00:19:41.337517 1177669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:19:41.337604 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:19:41.364732 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:19:41.395259 1177669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:19:41.395373 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:19:41.425216 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:19:41.453795 1177669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:19:41.688599 1177669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:19:41.909163 1177669 docker.go:234] disabling docker service ...
	I1218 00:19:41.909312 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:19:41.926883 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:19:41.943387 1177669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:19:42.156451 1177669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:19:42.449825 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:19:42.467750 1177669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:19:42.493864 1177669 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:19:42.495463 1177669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:19:42.495560 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.506971 1177669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:19:42.507118 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.518977 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.530876 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.539925 1177669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:19:42.553447 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.569558 1177669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.582698 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.597525 1177669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:19:42.608606 1177669 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:19:42.609612 1177669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:19:42.617962 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:19:42.846451 1177669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:21:13.130293 1177669 ssh_runner.go:235] Completed: sudo systemctl restart crio: (1m30.283808536s)
	I1218 00:21:13.130318 1177669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:21:13.130368 1177669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:21:13.134416 1177669 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:21:13.134438 1177669 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:21:13.134453 1177669 command_runner.go:130] > Device: 0,72	Inode: 804         Links: 1
	I1218 00:21:13.134460 1177669 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:13.134465 1177669 command_runner.go:130] > Access: 2025-12-18 00:21:13.087402358 +0000
	I1218 00:21:13.134471 1177669 command_runner.go:130] > Modify: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134475 1177669 command_runner.go:130] > Change: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134479 1177669 command_runner.go:130] >  Birth: -
	I1218 00:21:13.134836 1177669 start.go:564] Will wait 60s for crictl version
	I1218 00:21:13.134895 1177669 ssh_runner.go:195] Run: which crictl
	I1218 00:21:13.138647 1177669 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:21:13.138725 1177669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:21:13.167266 1177669 command_runner.go:130] > Version:  0.1.0
	I1218 00:21:13.167284 1177669 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:21:13.167289 1177669 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:21:13.167294 1177669 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:21:13.169251 1177669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:21:13.169347 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.194596 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.194618 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.194624 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.194629 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.194634 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.194639 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.194643 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.194656 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.194660 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.194671 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.194674 1177669 command_runner.go:130] >      static
	I1218 00:21:13.194678 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.194682 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.194686 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.194689 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.194693 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.194697 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.194701 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.194705 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.194709 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.196349 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.221274 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.221297 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.221302 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.221308 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.221313 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.221318 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.221321 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.221326 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.221331 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.221334 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.221338 1177669 command_runner.go:130] >      static
	I1218 00:21:13.221341 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.221345 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.221350 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.221353 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.221357 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.221360 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.221364 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.221369 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.221373 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.226046 1177669 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:21:13.228983 1177669 cli_runner.go:164] Run: docker network inspect functional-240845 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:21:13.244579 1177669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:21:13.248178 1177669 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:21:13.248440 1177669 kubeadm.go:884] updating cluster {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA API
ServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:fal
se DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:21:13.248553 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:21:13.248613 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.282229 1177669 command_runner.go:130] > {
	I1218 00:21:13.282251 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.282256 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282265 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.282269 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282275 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.282279 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282283 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282294 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.282305 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.282308 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282313 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.282332 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282342 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282346 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282350 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282356 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.282364 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282370 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.282373 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282378 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282389 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.282398 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.282403 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282408 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.282415 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282422 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282426 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282434 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282444 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.282449 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282454 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.282462 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282466 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282475 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.282483 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.282491 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282495 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.282499 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282503 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282506 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282509 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282516 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.282523 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282528 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.282532 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282536 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282549 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.282557 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.282564 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282569 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.282578 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.282586 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282589 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282592 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282599 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.282606 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282611 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.282615 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282624 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282631 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.282643 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.282647 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282651 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.282658 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282661 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282665 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282669 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282676 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282680 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282698 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282709 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.282714 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282719 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.282726 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282729 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282737 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.282746 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.282751 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282755 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.282759 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282765 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282769 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282777 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282782 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282785 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282788 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282795 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.282802 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282807 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.282811 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282815 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282828 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.282836 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.282843 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282850 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.282853 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282862 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282865 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282869 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282873 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282883 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282887 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282894 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.282902 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282907 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.282910 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282913 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282922 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.282941 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.282952 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282957 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.282967 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282970 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282973 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282976 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282984 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.282999 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283004 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.283007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283010 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283018 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.283026 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.283029 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283036 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.283040 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283046 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.283054 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283061 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283065 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.283067 1177669 command_runner.go:130] >     },
	I1218 00:21:13.283071 1177669 command_runner.go:130] >     {
	I1218 00:21:13.283079 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.283084 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283089 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.283092 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283099 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283107 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.283116 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.283122 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283126 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.283129 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283133 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.283136 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283144 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283148 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.283155 1177669 command_runner.go:130] >     }
	I1218 00:21:13.283158 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.283161 1177669 command_runner.go:130] > }
	I1218 00:21:13.283336 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.283347 1177669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:21:13.283410 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.307800 1177669 command_runner.go:130] > {
	I1218 00:21:13.307819 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.307823 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307831 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.307836 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307841 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.307845 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307849 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307861 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.307869 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.307872 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307877 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.307881 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307886 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307889 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307893 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307899 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.307903 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307909 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.307912 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307921 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307929 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.307940 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.307943 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307947 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.307951 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307959 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307962 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307965 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307971 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.307975 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307980 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.307983 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307987 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307995 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.308003 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.308007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308011 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.308015 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308020 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308023 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308026 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308032 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.308036 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308042 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.308045 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308049 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308057 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.308065 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.308068 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308072 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.308080 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.308084 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308087 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308090 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308099 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.308103 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308108 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.308111 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308114 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308122 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.308129 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.308132 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308136 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.308140 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308143 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308146 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308149 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308153 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308156 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308159 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308165 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.308168 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308173 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.308176 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308180 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308188 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.308195 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.308198 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308202 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.308206 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308210 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308213 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308217 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308241 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308244 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308247 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308253 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.308262 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308269 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.308275 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308279 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308287 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.308295 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.308298 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308302 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.308306 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308309 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308312 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308316 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308319 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308323 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308325 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308332 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.308335 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308340 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.308343 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308347 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308354 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.308370 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.308374 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308377 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.308381 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308385 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308387 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308390 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308397 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.308400 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308405 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.308408 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308412 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308422 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.308430 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.308433 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308437 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.308440 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308444 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308447 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308450 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308455 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308458 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308461 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308468 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.308472 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308477 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.308480 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308484 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308491 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.308498 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.308501 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308505 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.308508 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308512 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.308515 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308518 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308522 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.308524 1177669 command_runner.go:130] >     }
	I1218 00:21:13.308527 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.308529 1177669 command_runner.go:130] > }
	I1218 00:21:13.310403 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.310424 1177669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:21:13.310432 1177669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.3 crio true true} ...
	I1218 00:21:13.310536 1177669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-240845 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:21:13.310619 1177669 ssh_runner.go:195] Run: crio config
	I1218 00:21:13.358161 1177669 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:21:13.358186 1177669 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:21:13.358194 1177669 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:21:13.358198 1177669 command_runner.go:130] > #
	I1218 00:21:13.358205 1177669 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:21:13.358212 1177669 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:21:13.358218 1177669 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:21:13.358229 1177669 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:21:13.358236 1177669 command_runner.go:130] > # reload'.
	I1218 00:21:13.358243 1177669 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:21:13.358250 1177669 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:21:13.358258 1177669 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:21:13.358264 1177669 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:21:13.358267 1177669 command_runner.go:130] > [crio]
	I1218 00:21:13.358273 1177669 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:21:13.358277 1177669 command_runner.go:130] > # containers images, in this directory.
	I1218 00:21:13.358820 1177669 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:21:13.358837 1177669 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:21:13.359435 1177669 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:21:13.359448 1177669 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:21:13.359935 1177669 command_runner.go:130] > # imagestore = ""
	I1218 00:21:13.359950 1177669 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:21:13.359963 1177669 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:21:13.360646 1177669 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:21:13.360660 1177669 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:21:13.360667 1177669 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:21:13.360961 1177669 command_runner.go:130] > # storage_option = [
	I1218 00:21:13.361308 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.361321 1177669 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:21:13.361334 1177669 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:21:13.361921 1177669 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:21:13.361934 1177669 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:21:13.361949 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:21:13.361954 1177669 command_runner.go:130] > # always happen on a node reboot
	I1218 00:21:13.362559 1177669 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:21:13.362583 1177669 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:21:13.362590 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:21:13.362595 1177669 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:21:13.363052 1177669 command_runner.go:130] > # version_file_persist = ""
	I1218 00:21:13.363067 1177669 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:21:13.363076 1177669 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:21:13.363680 1177669 command_runner.go:130] > # internal_wipe = true
	I1218 00:21:13.363702 1177669 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:21:13.363709 1177669 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:21:13.364349 1177669 command_runner.go:130] > # internal_repair = true
	I1218 00:21:13.364361 1177669 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:21:13.364368 1177669 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:21:13.364377 1177669 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:21:13.364926 1177669 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:21:13.364942 1177669 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:21:13.364946 1177669 command_runner.go:130] > [crio.api]
	I1218 00:21:13.364951 1177669 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:21:13.365581 1177669 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:21:13.365594 1177669 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:21:13.367685 1177669 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:21:13.367700 1177669 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:21:13.367706 1177669 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:21:13.367710 1177669 command_runner.go:130] > # stream_port = "0"
	I1218 00:21:13.367716 1177669 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:21:13.367723 1177669 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:21:13.367730 1177669 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:21:13.367745 1177669 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:21:13.367752 1177669 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:21:13.367762 1177669 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367766 1177669 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:21:13.367773 1177669 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:21:13.367780 1177669 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367784 1177669 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:21:13.367791 1177669 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:21:13.367802 1177669 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:21:13.367808 1177669 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:21:13.367814 1177669 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:21:13.367835 1177669 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367844 1177669 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:21:13.367853 1177669 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367861 1177669 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:21:13.367868 1177669 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:21:13.367879 1177669 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:21:13.367883 1177669 command_runner.go:130] > [crio.runtime]
	I1218 00:21:13.367893 1177669 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:21:13.367904 1177669 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:21:13.367916 1177669 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:21:13.367926 1177669 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:21:13.367934 1177669 command_runner.go:130] > # default_ulimits = [
	I1218 00:21:13.367937 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.367950 1177669 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:21:13.367958 1177669 command_runner.go:130] > # no_pivot = false
	I1218 00:21:13.367963 1177669 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:21:13.367974 1177669 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:21:13.367979 1177669 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:21:13.367988 1177669 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:21:13.367994 1177669 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:21:13.368004 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368012 1177669 command_runner.go:130] > # conmon = ""
	I1218 00:21:13.368015 1177669 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:21:13.368023 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:21:13.368026 1177669 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:21:13.368035 1177669 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:21:13.368044 1177669 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:21:13.368051 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368058 1177669 command_runner.go:130] > # conmon_env = [
	I1218 00:21:13.368061 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368070 1177669 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:21:13.368076 1177669 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:21:13.368084 1177669 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:21:13.368089 1177669 command_runner.go:130] > # default_env = [
	I1218 00:21:13.368092 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368098 1177669 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:21:13.368111 1177669 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:21:13.368119 1177669 command_runner.go:130] > # selinux = false
	I1218 00:21:13.368125 1177669 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:21:13.368136 1177669 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:21:13.368144 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368148 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.368159 1177669 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:21:13.368167 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368171 1177669 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:21:13.368178 1177669 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:21:13.368189 1177669 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:21:13.368199 1177669 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:21:13.368206 1177669 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:21:13.368212 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368217 1177669 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:21:13.368256 1177669 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:21:13.368261 1177669 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:21:13.368266 1177669 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:21:13.368280 1177669 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:21:13.368287 1177669 command_runner.go:130] > # blockio parameters.
	I1218 00:21:13.368292 1177669 command_runner.go:130] > # blockio_reload = false
	I1218 00:21:13.368298 1177669 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:21:13.368303 1177669 command_runner.go:130] > # irqbalance daemon.
	I1218 00:21:13.368311 1177669 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:21:13.368320 1177669 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:21:13.368327 1177669 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:21:13.368337 1177669 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:21:13.368347 1177669 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:21:13.368357 1177669 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:21:13.368365 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368369 1177669 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:21:13.368375 1177669 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:21:13.368382 1177669 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:21:13.368388 1177669 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:21:13.368396 1177669 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:21:13.368402 1177669 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:21:13.368412 1177669 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:21:13.368419 1177669 command_runner.go:130] > # will be added.
	I1218 00:21:13.368423 1177669 command_runner.go:130] > # default_capabilities = [
	I1218 00:21:13.368430 1177669 command_runner.go:130] > # 	"CHOWN",
	I1218 00:21:13.368434 1177669 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:21:13.368442 1177669 command_runner.go:130] > # 	"FSETID",
	I1218 00:21:13.368445 1177669 command_runner.go:130] > # 	"FOWNER",
	I1218 00:21:13.368457 1177669 command_runner.go:130] > # 	"SETGID",
	I1218 00:21:13.368461 1177669 command_runner.go:130] > # 	"SETUID",
	I1218 00:21:13.368479 1177669 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:21:13.368487 1177669 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:21:13.368490 1177669 command_runner.go:130] > # 	"KILL",
	I1218 00:21:13.368494 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368506 1177669 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:21:13.368515 1177669 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:21:13.368524 1177669 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:21:13.368531 1177669 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:21:13.368539 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368542 1177669 command_runner.go:130] > default_sysctls = [
	I1218 00:21:13.368547 1177669 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:21:13.368554 1177669 command_runner.go:130] > ]
	I1218 00:21:13.368563 1177669 command_runner.go:130] > # List of devices on the host that a
	I1218 00:21:13.368570 1177669 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:21:13.368577 1177669 command_runner.go:130] > # allowed_devices = [
	I1218 00:21:13.368580 1177669 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:21:13.368588 1177669 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:21:13.368594 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368603 1177669 command_runner.go:130] > # List of additional devices. specified as
	I1218 00:21:13.368611 1177669 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:21:13.368618 1177669 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:21:13.368624 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368628 1177669 command_runner.go:130] > # additional_devices = [
	I1218 00:21:13.368633 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368639 1177669 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:21:13.368646 1177669 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:21:13.368649 1177669 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:21:13.368653 1177669 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:21:13.368664 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368673 1177669 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:21:13.368683 1177669 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:21:13.368701 1177669 command_runner.go:130] > # Defaults to false.
	I1218 00:21:13.368712 1177669 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:21:13.368719 1177669 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:21:13.368725 1177669 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:21:13.368734 1177669 command_runner.go:130] > # hooks_dir = [
	I1218 00:21:13.368739 1177669 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:21:13.368745 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368751 1177669 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:21:13.368761 1177669 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:21:13.368770 1177669 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:21:13.368773 1177669 command_runner.go:130] > #
	I1218 00:21:13.368780 1177669 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:21:13.368789 1177669 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:21:13.368795 1177669 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:21:13.368803 1177669 command_runner.go:130] > #
	I1218 00:21:13.368809 1177669 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:21:13.368818 1177669 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:21:13.368829 1177669 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:21:13.368846 1177669 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:21:13.368853 1177669 command_runner.go:130] > #
	I1218 00:21:13.368857 1177669 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:21:13.368866 1177669 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:21:13.368876 1177669 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:21:13.368880 1177669 command_runner.go:130] > # pids_limit = -1
	I1218 00:21:13.368886 1177669 command_runner.go:130] > # Maximum sized allowed for the container log file. Negative numbers indicate
	I1218 00:21:13.368894 1177669 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:21:13.368904 1177669 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:21:13.368917 1177669 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:21:13.368923 1177669 command_runner.go:130] > # log_size_max = -1
	I1218 00:21:13.368931 1177669 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:21:13.368938 1177669 command_runner.go:130] > # log_to_journald = false
	I1218 00:21:13.368944 1177669 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:21:13.368949 1177669 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:21:13.368959 1177669 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:21:13.368968 1177669 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:21:13.368974 1177669 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:21:13.368981 1177669 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:21:13.368986 1177669 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:21:13.368993 1177669 command_runner.go:130] > # read_only = false
	I1218 00:21:13.369000 1177669 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:21:13.369009 1177669 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:21:13.369017 1177669 command_runner.go:130] > # live configuration reload.
	I1218 00:21:13.369020 1177669 command_runner.go:130] > # log_level = "info"
	I1218 00:21:13.369026 1177669 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:21:13.369031 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.369036 1177669 command_runner.go:130] > # log_filter = ""
	I1218 00:21:13.369043 1177669 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369052 1177669 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:21:13.369056 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369067 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369074 1177669 command_runner.go:130] > # uid_mappings = ""
	I1218 00:21:13.369084 1177669 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369093 1177669 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:21:13.369097 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369105 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369114 1177669 command_runner.go:130] > # gid_mappings = ""
	I1218 00:21:13.369120 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:21:13.369127 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369139 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369150 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369158 1177669 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:21:13.369165 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:21:13.369174 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369184 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369192 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369196 1177669 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:21:13.369208 1177669 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:21:13.369218 1177669 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:21:13.369224 1177669 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:21:13.369231 1177669 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:21:13.369238 1177669 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:21:13.369247 1177669 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:21:13.369256 1177669 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:21:13.369261 1177669 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:21:13.369265 1177669 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:21:13.369273 1177669 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:21:13.369279 1177669 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:21:13.369286 1177669 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:21:13.369293 1177669 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:21:13.369301 1177669 command_runner.go:130] > # shared_cpuset  determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:21:13.369310 1177669 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:21:13.369320 1177669 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:21:13.369326 1177669 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:21:13.369333 1177669 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:21:13.369339 1177669 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:21:13.369347 1177669 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:21:13.369351 1177669 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:21:13.369359 1177669 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:21:13.369363 1177669 command_runner.go:130] > # pinns_path = ""
	I1218 00:21:13.369368 1177669 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:21:13.369378 1177669 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:21:13.369382 1177669 command_runner.go:130] > # enable_criu_support = true
	I1218 00:21:13.369390 1177669 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:21:13.369400 1177669 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:21:13.369407 1177669 command_runner.go:130] > # enable_pod_events = false
	I1218 00:21:13.369414 1177669 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:21:13.369422 1177669 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:21:13.369426 1177669 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:21:13.369431 1177669 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:21:13.369443 1177669 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:21:13.369457 1177669 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:21:13.369465 1177669 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:21:13.369474 1177669 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:21:13.369481 1177669 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:21:13.369486 1177669 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:21:13.369492 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.369499 1177669 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:21:13.369509 1177669 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:21:13.369515 1177669 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:21:13.369521 1177669 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:21:13.369523 1177669 command_runner.go:130] > #
	I1218 00:21:13.369528 1177669 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:21:13.369536 1177669 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:21:13.369540 1177669 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:21:13.369548 1177669 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:21:13.369553 1177669 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:21:13.369561 1177669 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:21:13.369565 1177669 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:21:13.369574 1177669 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:21:13.369577 1177669 command_runner.go:130] > # monitor_env = []
	I1218 00:21:13.369585 1177669 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:21:13.369590 1177669 command_runner.go:130] > # allowed_annotations = []
	I1218 00:21:13.369595 1177669 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:21:13.369599 1177669 command_runner.go:130] > # no_sync_log = false
	I1218 00:21:13.369603 1177669 command_runner.go:130] > # default_annotations = {}
	I1218 00:21:13.369611 1177669 command_runner.go:130] > # stream_websockets = false
	I1218 00:21:13.369614 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.369664 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.369673 1177669 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:21:13.369680 1177669 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:21:13.369686 1177669 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:21:13.369697 1177669 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:21:13.369708 1177669 command_runner.go:130] > #   in $PATH.
	I1218 00:21:13.369718 1177669 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:21:13.369728 1177669 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:21:13.369735 1177669 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:21:13.369741 1177669 command_runner.go:130] > #   state.
	I1218 00:21:13.369747 1177669 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:21:13.369753 1177669 command_runner.go:130] > #   file. This can only be used with when using the VM runtime_type.
	I1218 00:21:13.369759 1177669 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:21:13.369765 1177669 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:21:13.369774 1177669 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:21:13.369780 1177669 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:21:13.369789 1177669 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:21:13.369795 1177669 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:21:13.369805 1177669 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:21:13.369813 1177669 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:21:13.369820 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:21:13.369831 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:21:13.370100 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:21:13.370120 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:21:13.370129 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:21:13.370143 1177669 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:21:13.370151 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:21:13.370162 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:21:13.370169 1177669 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:21:13.370176 1177669 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:21:13.370187 1177669 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:21:13.370195 1177669 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:21:13.370206 1177669 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:21:13.370213 1177669 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:21:13.370219 1177669 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:21:13.370232 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:21:13.370239 1177669 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:21:13.370249 1177669 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:21:13.370266 1177669 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:21:13.370271 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:21:13.370283 1177669 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:21:13.370288 1177669 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:21:13.370295 1177669 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:21:13.370305 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:21:13.370313 1177669 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:21:13.370317 1177669 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:21:13.370329 1177669 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:21:13.370338 1177669 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:21:13.370350 1177669 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:21:13.370357 1177669 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:21:13.370367 1177669 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:21:13.370375 1177669 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:21:13.370388 1177669 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:21:13.370395 1177669 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:21:13.370408 1177669 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:21:13.370420 1177669 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:21:13.370425 1177669 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:21:13.370437 1177669 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:21:13.370445 1177669 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:21:13.370459 1177669 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:21:13.370466 1177669 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:21:13.370473 1177669 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:21:13.370485 1177669 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:21:13.370488 1177669 command_runner.go:130] > #
	I1218 00:21:13.370493 1177669 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:21:13.370496 1177669 command_runner.go:130] > #
	I1218 00:21:13.370506 1177669 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:21:13.370513 1177669 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:21:13.370516 1177669 command_runner.go:130] > #
	I1218 00:21:13.370525 1177669 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:21:13.370537 1177669 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:21:13.370545 1177669 command_runner.go:130] > #
	I1218 00:21:13.370553 1177669 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:21:13.370556 1177669 command_runner.go:130] > # feature.
	I1218 00:21:13.370563 1177669 command_runner.go:130] > #
	I1218 00:21:13.370569 1177669 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:21:13.370576 1177669 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:21:13.370587 1177669 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:21:13.370594 1177669 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:21:13.370600 1177669 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:21:13.370610 1177669 command_runner.go:130] > #
	I1218 00:21:13.370618 1177669 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:21:13.370625 1177669 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:21:13.370628 1177669 command_runner.go:130] > #
	I1218 00:21:13.370638 1177669 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:21:13.370644 1177669 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:21:13.370647 1177669 command_runner.go:130] > #
	I1218 00:21:13.370657 1177669 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:21:13.370664 1177669 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:21:13.370667 1177669 command_runner.go:130] > # limitation.
	I1218 00:21:13.370672 1177669 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:21:13.370680 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:21:13.370684 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.370688 1177669 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:21:13.370695 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.370699 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371091 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371100 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371106 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371111 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371151 1177669 command_runner.go:130] > allowed_annotations = [
	I1218 00:21:13.371159 1177669 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:21:13.371163 1177669 command_runner.go:130] > ]
	I1218 00:21:13.371167 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371172 1177669 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:21:13.371180 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:21:13.371184 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.371188 1177669 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:21:13.371224 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.371229 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371233 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371242 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371253 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371257 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371263 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371305 1177669 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:21:13.371314 1177669 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:21:13.371321 1177669 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:21:13.371342 1177669 command_runner.go:130] > # Each workload, has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:21:13.371388 1177669 command_runner.go:130] > # The currently supported resources are "cpuperiod" "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:21:13.371402 1177669 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:21:13.371414 1177669 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:21:13.371421 1177669 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:21:13.371470 1177669 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:21:13.371479 1177669 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:21:13.371490 1177669 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:21:13.371528 1177669 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:21:13.371536 1177669 command_runner.go:130] > # Example:
	I1218 00:21:13.371546 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:21:13.371551 1177669 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:21:13.371556 1177669 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:21:13.371561 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:21:13.371569 1177669 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:21:13.371573 1177669 command_runner.go:130] > # cpushares = "5"
	I1218 00:21:13.371606 1177669 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:21:13.371613 1177669 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:21:13.371617 1177669 command_runner.go:130] > # cpulimit = "35"
	I1218 00:21:13.371620 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.371629 1177669 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:21:13.371636 1177669 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:21:13.371647 1177669 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:21:13.371690 1177669 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:21:13.371702 1177669 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:21:13.371713 1177669 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
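Tying the workload example in the config above together, the pod-side annotations might look like the following sketch. Everything here is hypothetical and mirrors the example config: the activation key "io.crio/workload", the prefix "io.crio.workload-type", and the container name "app" are taken from or invented around the example, following the annotation form shown on the last line above.

```yaml
# Hypothetical pod metadata opting into the example "workload-type" workload.
apiVersion: v1
kind: Pod
metadata:
  name: workload-demo                    # hypothetical name
  annotations:
    # Activation: key-only match against activation_annotation (value ignored).
    io.crio/workload: ""
    # Per-container override for the container named "app", overriding the
    # default cpushares value from the workload's resources table.
    io.crio.workload-type/app: '{"cpushares": "10"}'
```

Containers not covered by a per-container annotation would receive the defaults from the workload's resources table (cpuset "0-1", cpushares "5", and so on in the example above).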
	I1218 00:21:13.371718 1177669 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:21:13.371726 1177669 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:21:13.371777 1177669 command_runner.go:130] > # Default value is set to true
	I1218 00:21:13.371785 1177669 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:21:13.371791 1177669 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:21:13.371796 1177669 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:21:13.371805 1177669 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:21:13.371846 1177669 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:21:13.371855 1177669 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:21:13.371869 1177669 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:21:13.371873 1177669 command_runner.go:130] > # timezone = ""
	I1218 00:21:13.371880 1177669 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:21:13.371883 1177669 command_runner.go:130] > #
	I1218 00:21:13.371923 1177669 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:21:13.371933 1177669 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:21:13.371937 1177669 command_runner.go:130] > [crio.image]
	I1218 00:21:13.371948 1177669 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:21:13.371953 1177669 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:21:13.371960 1177669 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:21:13.372001 1177669 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372008 1177669 command_runner.go:130] > # global_auth_file = ""
	I1218 00:21:13.372014 1177669 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:21:13.372020 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372029 1177669 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.372036 1177669 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:21:13.372043 1177669 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372052 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372057 1177669 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:21:13.372094 1177669 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:21:13.372111 1177669 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:21:13.372119 1177669 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:21:13.372125 1177669 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:21:13.372134 1177669 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:21:13.372140 1177669 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:21:13.372147 1177669 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:21:13.372187 1177669 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:21:13.372197 1177669 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:21:13.372204 1177669 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:21:13.372215 1177669 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:21:13.372270 1177669 command_runner.go:130] > # pinned_images = [
	I1218 00:21:13.372283 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372290 1177669 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:21:13.372301 1177669 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:21:13.372308 1177669 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:21:13.372319 1177669 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:21:13.372324 1177669 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:21:13.372362 1177669 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:21:13.372371 1177669 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:21:13.372384 1177669 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:21:13.372391 1177669 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:21:13.372402 1177669 command_runner.go:130] > # or the concatenated path is non existent, then the signature_policy or system
	I1218 00:21:13.372408 1177669 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:21:13.372414 1177669 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:21:13.372450 1177669 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:21:13.372460 1177669 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:21:13.372464 1177669 command_runner.go:130] > # changing them here.
	I1218 00:21:13.372475 1177669 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:21:13.372479 1177669 command_runner.go:130] > # insecure_registries = [
	I1218 00:21:13.372482 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372489 1177669 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:21:13.372498 1177669 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:21:13.372502 1177669 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:21:13.372541 1177669 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:21:13.372549 1177669 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:21:13.372559 1177669 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:21:13.372567 1177669 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:21:13.372572 1177669 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:21:13.372582 1177669 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:21:13.372591 1177669 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:21:13.372630 1177669 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:21:13.372638 1177669 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:21:13.372643 1177669 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:21:13.372650 1177669 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:21:13.372667 1177669 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:21:13.372672 1177669 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:21:13.372678 1177669 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:21:13.372721 1177669 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:21:13.372730 1177669 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:21:13.372735 1177669 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:21:13.372746 1177669 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:21:13.372750 1177669 command_runner.go:130] > # CNI plugins.
	I1218 00:21:13.372753 1177669 command_runner.go:130] > [crio.network]
	I1218 00:21:13.372759 1177669 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:21:13.372769 1177669 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:21:13.372773 1177669 command_runner.go:130] > # cni_default_network = ""
	I1218 00:21:13.372780 1177669 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:21:13.372837 1177669 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:21:13.372851 1177669 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:21:13.372856 1177669 command_runner.go:130] > # plugin_dirs = [
	I1218 00:21:13.372860 1177669 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:21:13.372863 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372867 1177669 command_runner.go:130] > # List of included pod metrics.
	I1218 00:21:13.372903 1177669 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:21:13.372909 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372923 1177669 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:21:13.372927 1177669 command_runner.go:130] > [crio.metrics]
	I1218 00:21:13.372933 1177669 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:21:13.372941 1177669 command_runner.go:130] > # enable_metrics = false
	I1218 00:21:13.372946 1177669 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:21:13.372951 1177669 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:21:13.372958 1177669 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:21:13.372999 1177669 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:21:13.373006 1177669 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:21:13.373010 1177669 command_runner.go:130] > # metrics_collectors = [
	I1218 00:21:13.373018 1177669 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:21:13.373023 1177669 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:21:13.373033 1177669 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:21:13.373037 1177669 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:21:13.373042 1177669 command_runner.go:130] > # 	"operations_total",
	I1218 00:21:13.373077 1177669 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:21:13.373084 1177669 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:21:13.373089 1177669 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:21:13.373093 1177669 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:21:13.373098 1177669 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:21:13.373106 1177669 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:21:13.373111 1177669 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:21:13.373115 1177669 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:21:13.373120 1177669 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:21:13.373133 1177669 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:21:13.373167 1177669 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:21:13.373176 1177669 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:21:13.373179 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.373190 1177669 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:21:13.373199 1177669 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:21:13.373205 1177669 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:21:13.373209 1177669 command_runner.go:130] > # metrics_port = 9090
	I1218 00:21:13.373214 1177669 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:21:13.373222 1177669 command_runner.go:130] > # metrics_socket = ""
	I1218 00:21:13.373425 1177669 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:21:13.373436 1177669 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:21:13.373448 1177669 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:21:13.373454 1177669 command_runner.go:130] > # certificate on any modification event.
	I1218 00:21:13.373457 1177669 command_runner.go:130] > # metrics_cert = ""
	I1218 00:21:13.373463 1177669 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:21:13.373472 1177669 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:21:13.373475 1177669 command_runner.go:130] > # metrics_key = ""
	I1218 00:21:13.373510 1177669 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:21:13.373518 1177669 command_runner.go:130] > [crio.tracing]
	I1218 00:21:13.373528 1177669 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:21:13.373538 1177669 command_runner.go:130] > # enable_tracing = false
	I1218 00:21:13.373545 1177669 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:21:13.373549 1177669 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:21:13.373560 1177669 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:21:13.373565 1177669 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:21:13.373569 1177669 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:21:13.373602 1177669 command_runner.go:130] > [crio.nri]
	I1218 00:21:13.373606 1177669 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:21:13.373614 1177669 command_runner.go:130] > # enable_nri = true
	I1218 00:21:13.373618 1177669 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:21:13.373623 1177669 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:21:13.373628 1177669 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:21:13.373632 1177669 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:21:13.373641 1177669 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:21:13.373646 1177669 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:21:13.373652 1177669 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:21:13.374323 1177669 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:21:13.374347 1177669 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:21:13.374353 1177669 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:21:13.374359 1177669 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:21:13.374369 1177669 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:21:13.374374 1177669 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:21:13.374384 1177669 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:21:13.374396 1177669 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:21:13.374400 1177669 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:21:13.374404 1177669 command_runner.go:130] > # - OCI hook injection
	I1218 00:21:13.374410 1177669 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:21:13.374419 1177669 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:21:13.374424 1177669 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:21:13.374429 1177669 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:21:13.374440 1177669 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:21:13.374447 1177669 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:21:13.374453 1177669 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:21:13.374461 1177669 command_runner.go:130] > #
	I1218 00:21:13.374470 1177669 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:21:13.374475 1177669 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:21:13.374481 1177669 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:21:13.374487 1177669 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:21:13.374497 1177669 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:21:13.374503 1177669 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:21:13.374508 1177669 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:21:13.374517 1177669 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:21:13.374520 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.374526 1177669 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:21:13.374532 1177669 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:21:13.374540 1177669 command_runner.go:130] > [crio.stats]
	I1218 00:21:13.374546 1177669 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:21:13.374552 1177669 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:21:13.374557 1177669 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:21:13.374567 1177669 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:21:13.374574 1177669 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:21:13.374578 1177669 command_runner.go:130] > # collection_period = 0
	I1218 00:21:13.375235 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337716712Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:21:13.375252 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337755529Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:21:13.375261 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337787676Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:21:13.375269 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337813217Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:21:13.375279 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337887603Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:21:13.375295 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.338323059Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:21:13.375307 1177669 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:21:13.375636 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:21:13.375654 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:21:13.375670 1177669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:21:13.375692 1177669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-240845 NodeName:functional-240845 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc
/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:21:13.375818 1177669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-240845"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:21:13.375897 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:21:13.382943 1177669 command_runner.go:130] > kubeadm
	I1218 00:21:13.382987 1177669 command_runner.go:130] > kubectl
	I1218 00:21:13.382992 1177669 command_runner.go:130] > kubelet
	I1218 00:21:13.383228 1177669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:21:13.383323 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:21:13.390563 1177669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (367 bytes)
	I1218 00:21:13.402469 1177669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:21:13.415695 1177669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2214 bytes)
	I1218 00:21:13.427935 1177669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:21:13.431432 1177669 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:21:13.431528 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:13.573724 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:13.587283 1177669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845 for IP: 192.168.49.2
	I1218 00:21:13.587308 1177669 certs.go:195] generating shared ca certs ...
	I1218 00:21:13.587325 1177669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:13.587468 1177669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:21:13.587523 1177669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:21:13.587535 1177669 certs.go:257] generating profile certs ...
	I1218 00:21:13.587627 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key
	I1218 00:21:13.587682 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key.83c30509
	I1218 00:21:13.587749 1177669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key
	I1218 00:21:13.587763 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:21:13.587778 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:21:13.587791 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:21:13.587807 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:21:13.587827 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:21:13.587840 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:21:13.587855 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:21:13.587866 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:21:13.587928 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:21:13.587965 1177669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:21:13.587976 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:21:13.588004 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:21:13.588031 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:21:13.588058 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:21:13.588108 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:21:13.588142 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.588156 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.588167 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.588757 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:21:13.607287 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:21:13.626005 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:21:13.643497 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:21:13.660653 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:21:13.677616 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 00:21:13.694313 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:21:13.711161 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:21:13.728011 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:21:13.745006 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:21:13.761771 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:21:13.778664 1177669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:21:13.791171 1177669 ssh_runner.go:195] Run: openssl version
	I1218 00:21:13.796833 1177669 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:21:13.797285 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.804618 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:21:13.812913 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816610 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816655 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816704 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.857240 1177669 command_runner.go:130] > 51391683
	I1218 00:21:13.857318 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:21:13.864756 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.871981 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:21:13.879459 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883023 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883055 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883126 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.923479 1177669 command_runner.go:130] > 3ec20f2e
	I1218 00:21:13.923967 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:21:13.931505 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.938743 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:21:13.946369 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950234 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950276 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950327 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.990419 1177669 command_runner.go:130] > b5213941
	I1218 00:21:13.990837 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
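	The `openssl x509 -hash` / `ln -fs` / `test -L` sequence above is OpenSSL's subject-hash trust-store convention: a CA is looked up by hashing its subject name and following a `<hash>.0` symlink in the certs directory. A minimal sketch of the same scheme, assuming `openssl` is on PATH and using throwaway paths under `/tmp`:

	```shell
	# Generate a throwaway self-signed cert to stand in for minikubeCA.pem.
	openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
	  -out /tmp/demo.pem -subj "/CN=hash-demo" -days 1 2>/dev/null
	# Compute the subject-name hash, as the log does for each CA above.
	hash=$(openssl x509 -hash -noout -in /tmp/demo.pem)
	# Create the "<hash>.0" symlink that OpenSSL's lookup follows.
	ln -sf /tmp/demo.pem "/tmp/${hash}.0"
	# Verify the link exists, mirroring the "sudo test -L" checks in the log.
	test -L "/tmp/${hash}.0" && echo "trust link: /tmp/${hash}.0"
	```

	The `.0` suffix is a collision counter; a second CA whose subject hashes to the same value would be linked as `<hash>.1`.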
	I1218 00:21:13.998401 1177669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003376 1177669 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003402 1177669 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:21:14.003409 1177669 command_runner.go:130] > Device: 259,1	Inode: 1327743     Links: 1
	I1218 00:21:14.003416 1177669 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:14.003422 1177669 command_runner.go:130] > Access: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003427 1177669 command_runner.go:130] > Modify: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003432 1177669 command_runner.go:130] > Change: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003438 1177669 command_runner.go:130] >  Birth: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003512 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:21:14.045243 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.045691 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:21:14.086658 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.086738 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:21:14.127897 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.128372 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:21:14.168626 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.169131 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:21:14.209194 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.209712 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:21:14.250333 1177669 command_runner.go:130] > Certificate will not expire
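	Each "Certificate will not expire" line above comes from `openssl x509 -checkend 86400`, which exits 0 and prints that message iff the certificate is still valid 86400 seconds (24 hours) from now. A minimal sketch, assuming `openssl` is on PATH and using a throwaway self-signed cert in place of the minikube profile certs:

	```shell
	# Create a cert valid for 2 days, so the 24-hour check below passes.
	openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/exp.key \
	  -out /tmp/exp.pem -subj "/CN=expiry-demo" -days 2 2>/dev/null
	# Same check minikube runs per cert; prints "Certificate will not expire"
	# and exits 0 if the cert is still valid 86400s from now.
	openssl x509 -noout -in /tmp/exp.pem -checkend 86400
	```

	A cert within 24 hours of expiry would instead print "Certificate will expire" and exit 1, which is what triggers minikube's cert regeneration path.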
	I1218 00:21:14.250470 1177669 kubeadm.go:401] StartCluster: {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APISer
verNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false
DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:21:14.250558 1177669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:21:14.250623 1177669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:21:14.277777 1177669 command_runner.go:130] > e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563
	I1218 00:21:14.277803 1177669 command_runner.go:130] > 0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c
	I1218 00:21:14.277811 1177669 command_runner.go:130] > 95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e
	I1218 00:21:14.277820 1177669 command_runner.go:130] > 1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5
	I1218 00:21:14.277826 1177669 command_runner.go:130] > 9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1
	I1218 00:21:14.277832 1177669 command_runner.go:130] > 3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad
	I1218 00:21:14.277838 1177669 command_runner.go:130] > cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15
	I1218 00:21:14.277846 1177669 command_runner.go:130] > 38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68
	I1218 00:21:14.277857 1177669 command_runner.go:130] > 1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b
	I1218 00:21:14.277868 1177669 command_runner.go:130] > 61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a
	I1218 00:21:14.277874 1177669 command_runner.go:130] > 98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe
	I1218 00:21:14.277883 1177669 command_runner.go:130] > 891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b
	I1218 00:21:14.277889 1177669 command_runner.go:130] > 2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de
	I1218 00:21:14.277898 1177669 command_runner.go:130] > b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06
	I1218 00:21:14.280281 1177669 cri.go:89] found id: "e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563"
	I1218 00:21:14.280303 1177669 cri.go:89] found id: "0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c"
	I1218 00:21:14.280308 1177669 cri.go:89] found id: "95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e"
	I1218 00:21:14.280312 1177669 cri.go:89] found id: "1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5"
	I1218 00:21:14.280315 1177669 cri.go:89] found id: "9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1"
	I1218 00:21:14.280319 1177669 cri.go:89] found id: "3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad"
	I1218 00:21:14.280323 1177669 cri.go:89] found id: "cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15"
	I1218 00:21:14.280326 1177669 cri.go:89] found id: "38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68"
	I1218 00:21:14.280329 1177669 cri.go:89] found id: "1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b"
	I1218 00:21:14.280337 1177669 cri.go:89] found id: "61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a"
	I1218 00:21:14.280343 1177669 cri.go:89] found id: "98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe"
	I1218 00:21:14.280347 1177669 cri.go:89] found id: "891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b"
	I1218 00:21:14.280355 1177669 cri.go:89] found id: "2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	I1218 00:21:14.280359 1177669 cri.go:89] found id: "b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06"
	I1218 00:21:14.280362 1177669 cri.go:89] found id: ""
	I1218 00:21:14.280415 1177669 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:21:14.291297 1177669 command_runner.go:130] ! time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	W1218 00:21:14.291357 1177669 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	I1218 00:21:14.291439 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:21:14.298396 1177669 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:21:14.298416 1177669 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:21:14.298422 1177669 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:21:14.298426 1177669 command_runner.go:130] > member
	I1218 00:21:14.299333 1177669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:21:14.299377 1177669 kubeadm.go:598] restartPrimaryControlPlane start ...
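	The decision at kubeadm.go:417 above hinges on a single `sudo ls` over the kubelet config files and the etcd data dir: if all three paths exist, minikube attempts a cluster restart rather than a fresh `kubeadm init`. A minimal sketch of that heuristic, using a throwaway root under `/tmp` so it stays runnable (the real paths are `/var/lib/kubelet/...` and `/var/lib/minikube/etcd`):

	```shell
	# Stage the three artifacts the log's "sudo ls" probes for.
	root=/tmp/kubeadm-demo
	mkdir -p "$root/var/lib/kubelet" "$root/var/lib/minikube/etcd/member"
	touch "$root/var/lib/kubelet/config.yaml" "$root/var/lib/kubelet/kubeadm-flags.env"
	# ls exits non-zero if ANY listed path is missing, so one call covers all three.
	if ls "$root/var/lib/kubelet/kubeadm-flags.env" "$root/var/lib/kubelet/config.yaml" \
	      "$root/var/lib/minikube/etcd" >/dev/null 2>&1; then
	  echo "found existing configuration files, will attempt cluster restart"
	else
	  echo "no existing configuration, would run fresh init"
	fi
	```

	Relying on `ls`'s exit status means a partially torn-down node (say, etcd data wiped but kubelet config left behind) falls through to the fresh-init branch.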
	I1218 00:21:14.299453 1177669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:21:14.306750 1177669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:21:14.307329 1177669 kubeconfig.go:125] found "functional-240845" server: "https://192.168.49.2:8441"
	I1218 00:21:14.308688 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.308922 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.310273 1177669 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:21:14.310295 1177669 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:21:14.310301 1177669 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:21:14.310306 1177669 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:21:14.310311 1177669 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:21:14.310598 1177669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:21:14.310964 1177669 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:21:14.321005 1177669 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:21:14.321038 1177669 kubeadm.go:602] duration metric: took 21.641512ms to restartPrimaryControlPlane
	I1218 00:21:14.321068 1177669 kubeadm.go:403] duration metric: took 70.601924ms to StartCluster
	I1218 00:21:14.321095 1177669 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.321175 1177669 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.321832 1177669 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.322054 1177669 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:21:14.322232 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:21:14.322270 1177669 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:21:14.322334 1177669 addons.go:70] Setting storage-provisioner=true in profile "functional-240845"
	I1218 00:21:14.322346 1177669 addons.go:239] Setting addon storage-provisioner=true in "functional-240845"
	W1218 00:21:14.322351 1177669 addons.go:248] addon storage-provisioner should already be in state true
	I1218 00:21:14.322373 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.322797 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.323222 1177669 addons.go:70] Setting default-storageclass=true in profile "functional-240845"
	I1218 00:21:14.323243 1177669 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-240845"
	I1218 00:21:14.323528 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.326222 1177669 out.go:179] * Verifying Kubernetes components...
	I1218 00:21:14.329298 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:14.352407 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.352567 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.353875 1177669 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:21:14.354521 1177669 addons.go:239] Setting addon default-storageclass=true in "functional-240845"
	W1218 00:21:14.354541 1177669 addons.go:248] addon default-storageclass should already be in state true
	I1218 00:21:14.354568 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.355010 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.357054 1177669 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.357084 1177669 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:21:14.357149 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.385892 1177669 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.385914 1177669 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:21:14.385974 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.412313 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.438252 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.538332 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:14.555412 1177669 node_ready.go:35] waiting up to 6m0s for node "functional-240845" to be "Ready" ...
	I1218 00:21:14.556627 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.558515 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:14.558665 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:14.559006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:14.569919 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.635955 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.636102 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.636146 1177669 retry.go:31] will retry after 274.076226ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.646979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.650760 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.650797 1177669 retry.go:31] will retry after 360.821893ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.911221 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.974464 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.974555 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.974595 1177669 retry.go:31] will retry after 225.739861ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.012854 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.055958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.056036 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.056342 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.079682 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.079793 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.079817 1177669 retry.go:31] will retry after 552.403697ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.200970 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.261673 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.261728 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.261746 1177669 retry.go:31] will retry after 669.780864ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.556091 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.632797 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.699577 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.699638 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.699664 1177669 retry.go:31] will retry after 634.295794ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.931763 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.990067 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.993514 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.993545 1177669 retry.go:31] will retry after 1.113615509s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.055858 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:16.334650 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:16.392078 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:16.395777 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.395856 1177669 retry.go:31] will retry after 558.474178ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.556248 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.556629 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:16.556701 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:16.955131 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:17.055832 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.055954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.056319 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:17.076617 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.076722 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.076755 1177669 retry.go:31] will retry after 1.676176244s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.108039 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:17.223472 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.223571 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.223606 1177669 retry.go:31] will retry after 1.165701868s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.556175 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.556607 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.056383 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.056458 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.056745 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.390333 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:18.466841 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.466880 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.466899 1177669 retry.go:31] will retry after 1.475434566s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.556290 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.556363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.556640 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.753095 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:18.817795 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.817871 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.817893 1177669 retry.go:31] will retry after 1.833170296s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:19.056294 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.056363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:19.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:19.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.556536 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.556903 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:19.943440 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:20.003817 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.008032 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.008069 1177669 retry.go:31] will retry after 3.979109659s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.056345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.056668 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.556404 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.556792 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.652153 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:20.711890 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.715639 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.715672 1177669 retry.go:31] will retry after 3.637109781s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:21.056958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.057040 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.057388 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:21.057444 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:21.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.555927 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.556005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.556330 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.056025 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.056094 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.056444 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.556246 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.556345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.556676 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:23.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:23.987349 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:24.051441 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.051487 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.051524 1177669 retry.go:31] will retry after 5.3171516s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.056654 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.056732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.057111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:24.353838 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:24.413422 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.413469 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.413487 1177669 retry.go:31] will retry after 3.340127313s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.555854 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.555928 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:26.056042 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.056124 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.056522 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:26.056585 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:26.556332 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.556411 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.056517 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.056589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.056942 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.555642 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.754507 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:27.812979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:27.813026 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:27.813045 1177669 retry.go:31] will retry after 6.95951013s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:28.056456 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.056550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:28.057006 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:28.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.055872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.368874 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:29.425933 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:29.429391 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.429423 1177669 retry.go:31] will retry after 6.711424265s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.555717 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.055742 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.055823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:30.556179 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:31.055879 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.055958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.056290 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:31.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.556084 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.056199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.555959 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.556028 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.556363 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:32.556413 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:33.055772 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.055844 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.056367 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:33.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.556178 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.055882 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.055955 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.556326 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.556397 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:34.556796 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:34.773144 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:34.829958 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:34.833899 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:34.833929 1177669 retry.go:31] will retry after 8.542321591s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:35.056329 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.056407 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.056770 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:35.556516 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.556605 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.556959 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.057369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.057701 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.141963 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:36.202438 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:36.202477 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.202496 1177669 retry.go:31] will retry after 7.758270018s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:37.055818 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.055893 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:37.056274 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:37.555746 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.555822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.055812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.056254 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.556149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:39.055837 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.056297 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:39.056352 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:39.556245 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.556663 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.555731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:41.556201 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:42.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.055854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.377156 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:43.435792 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:43.439377 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.439408 1177669 retry.go:31] will retry after 18.255208537s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.556665 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.556738 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.557098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:43.557163 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:43.961544 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:44.047619 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:44.047656 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.047681 1177669 retry.go:31] will retry after 16.124184127s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.055890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.056245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:46.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:46.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:46.555906 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.556364 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.056189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:48.056258 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:48.555948 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.556021 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.556370 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:50.056538 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.056619 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.056964 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:50.057018 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:50.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.556116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.555734 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:52.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:53.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.056005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.056346 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.556467 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.556542 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.556870 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:54.556927 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:55.055618 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.055704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.056046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:57.055721 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:57.056210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:57.555892 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.555965 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.556299 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.555877 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.556239 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:59.556292 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:00.055898 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.055985 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.056349 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:00.172859 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:00.349113 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:00.349165 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.349188 1177669 retry.go:31] will retry after 15.178958797s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.556482 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.556554 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.695619 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:01.764253 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:01.768251 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:01.768286 1177669 retry.go:31] will retry after 20.261734519s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:02.055637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.055714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.056058 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:02.056113 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:02.555751 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.555820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:04.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.056080 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:04.056144 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:04.556235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.556662 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.056450 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.056522 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.056859 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.556627 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.556731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.557039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:06.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:06.056174 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:06.555712 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.556120 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.056150 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.555911 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.555990 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.556341 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:08.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.055811 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.056155 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:08.056203 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:08.555901 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.555978 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.556327 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:10.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.055797 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:10.056287 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:10.555954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.556049 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.556390 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.555929 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.056135 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:12.556162 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:13.055804 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.056174 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:13.555873 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.555943 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.056098 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.056468 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.556454 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.556529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.556828 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:14.556880 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:15.056592 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.056660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.057019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.528571 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:15.555957 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.556023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.556331 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.591509 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:15.594869 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:15.594900 1177669 retry.go:31] will retry after 30.932709272s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:16.056512 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.056582 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.056902 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:16.555621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.555718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:17.055743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.056124 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:17.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:17.555739 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.555834 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.055987 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.056365 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.556060 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.556132 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.556480 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:19.056235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.056623 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:19.056697 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:19.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.556660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.556999 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.056621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.056700 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.057051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.555764 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.556195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.055711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:21.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:22.030818 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:22.056348 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.056446 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.056751 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:22.091069 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:22.094766 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.094798 1177669 retry.go:31] will retry after 47.715756714s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.556528 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.556883 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.056081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.555699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.555792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:24.055868 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.055942 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.056302 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:24.056357 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:24.556255 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.556349 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.056263 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.056721 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.556539 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.556623 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.557021 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.055767 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.055851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.555952 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.556027 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:26.556419 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:27.056075 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:27.556423 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.556518 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.056638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.057038 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.555732 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.555814 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.556167 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:29.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:29.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:29.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.055846 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.055931 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.056335 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.556057 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.556129 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.556500 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:31.056282 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.056362 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.056704 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:31.056761 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:31.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.556564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.556896 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.055712 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.556248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.055753 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.556054 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.556161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.556684 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:33.556740 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:34.056460 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.056532 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.056854 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:34.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.556213 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.555889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.556242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:36.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.056089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:36.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:36.555705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.055826 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.056241 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.555756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:38.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.056147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:38.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:38.555776 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.556214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.056109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.555711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.555807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.556166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:40.055954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.056034 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.056481 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:40.056546 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:40.556379 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.556469 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.556814 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.056496 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.056800 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.556572 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.556672 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.557008 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.055744 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.056248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.555835 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.555911 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.556276 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:42.556330 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:43.056000 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.056095 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.056464 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:43.556247 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.556319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.056503 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.056852 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.556012 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.556109 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:44.556512 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:45.057726 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.057809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.058234 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:45.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.556053 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.556425 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.055813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.528809 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:46.556363 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.556430 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.556863 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:46.556912 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:46.592076 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592111 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592213 1177669 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:22:47.056004 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.056462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:47.556264 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.556334 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.556652 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.056410 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.056481 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.056790 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.556515 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.556589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.556921 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:48.556975 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:49.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.055730 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:49.555733 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.555821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.556169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.055866 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.055935 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.555707 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.555815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:51.056519 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.056627 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:51.057002 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:51.555636 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.555709 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.556029 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.055820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.555872 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.555945 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.055695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.055766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.056179 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.555862 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.555934 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:53.556377 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:54.056043 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.056137 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.056494 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:54.556337 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.556432 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.556763 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.056566 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.056640 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.056965 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.555663 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.556084 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:56.056189 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:56.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.556131 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.056558 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.056624 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.056923 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.556638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.556705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.556914 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.055638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.055992 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.556608 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.556858 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:58.556906 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:59.055615 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.055693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.055962 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:59.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.055787 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.055870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.056214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.555725 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:01.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:01.056324 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:01.555994 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.556068 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.556424 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.056333 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.556470 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.556547 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.055624 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.055721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.056047 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:03.556183 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:04.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:04.556084 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.556154 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.556510 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.056318 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.056386 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.056739 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.556502 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.556575 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.556888 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:05.556938 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:06.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.056072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:06.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.555824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.556163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.056617 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.056703 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.057017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.555759 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.556245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:08.055620 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.055732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:08.056120 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:08.555828 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.555904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.055992 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.056064 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.056490 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.556279 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.811086 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:23:09.870262 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873844 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873941 1177669 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:23:09.877215 1177669 out.go:179] * Enabled addons: 
	I1218 00:23:09.880843 1177669 addons.go:530] duration metric: took 1m55.558566134s for enable addons: enabled=[]
	I1218 00:23:10.056212 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.056346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.056713 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:10.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:10.556554 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.556967 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.055656 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.055785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.555809 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.555880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.556212 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.055838 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.556050 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:12.556108 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:13.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.055825 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.056171 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:13.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.556080 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.556462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.055821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.056182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.556315 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.556385 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:14.556741 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:15.056401 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.056470 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.056793 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:15.556411 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.556480 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.556780 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.056647 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.056963 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:17.055850 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.055925 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.056282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:17.056334 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:17.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.556122 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.556497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.056286 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.056361 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.056685 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.556408 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.556802 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:19.056570 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.056646 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.057040 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:19.057095 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:19.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.555860 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.555958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.556317 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.056081 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.056416 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.555830 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.555903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:21.556323 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:22.056015 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.056091 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.056432 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:22.556188 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.556285 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.556619 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.056387 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.056459 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.056805 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.556577 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.556991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:23.557043 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:24.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:24.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.556090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.556429 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.056237 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.056319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.056651 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.556407 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.556484 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.556804 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:26.056601 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.056678 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.057039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:26.057096 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:26.556349 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.556417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.556670 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.056529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.555635 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.555714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.556073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.556018 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.556350 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:28.556398 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:29.056066 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.056141 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.056558 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:29.556410 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.556482 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.556819 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.056192 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.056697 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.556347 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.556425 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.556813 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:30.556882 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:31.056651 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.056724 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.057110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:31.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:33.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:33.056171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:33.556520 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.556626 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.557595 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.055711 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.555832 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.555907 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.556266 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:35.055820 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.056262 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:35.056317 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:35.555980 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.556055 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.556475 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.056253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.056329 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.056689 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.556253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.556322 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.556585 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:37.056343 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.056417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.056777 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:37.056832 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:37.556557 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.556628 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.055768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.055671 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.056076 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.555915 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.555994 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:39.556347 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:40.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.056458 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:40.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.556320 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.556648 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.056467 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.056544 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.556710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:41.557023 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:42.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:42.555713 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.055805 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.055880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.056188 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:44.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.055912 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.056273 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:44.056335 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:44.555920 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.555993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.556368 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.058236 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.058319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.058728 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.556602 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.556934 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.055727 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.056067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.556151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:46.556209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:47.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.056162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:47.555674 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.055810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.556441 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.556514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.556832 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:48.556892 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:49.056604 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.056674 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.057002 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:49.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.555771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.055675 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:51.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:51.056177 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:51.555865 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.555937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.556310 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.555777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:53.055816 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.055886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.056231 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:53.056281 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:53.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.556010 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.556362 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.556026 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.556097 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.556417 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.055678 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.056101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.555734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:55.556129 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:56.055730 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:56.555867 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.555946 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.556300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.056090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.056457 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.556250 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.556323 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.556654 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:57.556712 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:58.056487 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:58.555623 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.055906 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.055982 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.056358 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.555710 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.555803 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:00.059640 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.059720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.060067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:00.060115 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:00.556240 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.556671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.056422 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.056490 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.056823 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.556575 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.556648 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.556984 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.056108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.555781 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:02.556145 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:03.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:03.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.555789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.056105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.556112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:04.556502 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:05.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.056331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.056671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:05.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.556557 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.055645 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.055718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.056059 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.555753 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.556081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:07.056275 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.056343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.056625 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:07.056668 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:07.556435 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.556511 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.055588 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.055660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.056003 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.055807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.055881 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.056246 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.555809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:09.556165 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:10.055865 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.055961 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.056303 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:10.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.556089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.055834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.056300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.556519 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:11.556574 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:12.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.056402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.056727 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:12.556537 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.556620 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.556973 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.055664 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.555826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.556183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:14.055916 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.055993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.056372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:14.056425 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:14.556306 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.556384 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.556722 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.056477 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.056546 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.056868 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.556714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.557060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.556138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:16.556191 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:17.055685 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.056060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:17.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.056016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.555699 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:19.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:19.056164 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:19.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.055623 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.055701 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.056014 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.555727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.055681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:21.556151 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:22.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.056666 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:22.556365 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.556434 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.556767 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.056542 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.056618 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.056908 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.555630 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.556032 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:24.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:24.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:24.556065 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.556148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.556518 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.056084 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.056512 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.556289 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.556691 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:26.056500 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.056600 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.056966 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:26.057019 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:26.555679 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.556107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.055812 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.556038 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.556103 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:28.556166 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:29.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.555778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.555730 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:30.556245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:31.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:31.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.055797 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.555678 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:33.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:33.056107 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:33.555768 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.556187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.055758 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.055833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.556178 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:35.056326 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.056396 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.056747 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:35.056801 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:35.556586 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.556657 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.557044 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.055782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.555908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.556282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:37.056627 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.057073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:37.057140 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:37.555781 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.555850 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.055869 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.055947 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.056284 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.055844 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.055924 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.056291 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:39.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:40.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:40.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.056055 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.555718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.556072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:42.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:42.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:42.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.556016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.556296 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.556369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:44.556718 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:45.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.057363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.057788 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:45.556579 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.556652 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.556974 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.555790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:47.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.055779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.056086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:47.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:47.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.555807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.555758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.556101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:49.556155 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:50.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.056148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:50.555821 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.555892 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.556250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.056032 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.056394 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:51.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:52.055852 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.056308 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:52.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.556071 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.556442 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.056207 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.056295 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.056620 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.556372 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.556448 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.556772 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:53.556829 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:54.056587 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.056664 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.057045 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:54.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.556382 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.056125 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:56.055825 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.055904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.056298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:56.056356 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:56.556003 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.556076 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.056092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.555906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.556260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:58.556314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:59.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.055748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:59.555874 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.555954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.055976 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.056062 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.056441 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.556138 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.556250 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.556688 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:00.556759 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:01.056530 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.056604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.056936 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:01.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.555731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.056275 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.555950 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.556022 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.556393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:03.056088 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.056161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.056527 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:03.056581 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:03.556314 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.556377 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.556644 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.056325 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.056398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.056741 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.556656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.556735 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.557093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.055643 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.556394 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.556464 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.556836 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:05.556889 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:06.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.057085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:06.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.056100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.556130 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:08.055819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.056255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:08.056314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:08.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.556052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.556389 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.556157 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.556549 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:10.056352 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.056426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.056786 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:10.056843 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:10.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.556658 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.557006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.055714 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.555890 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.555963 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.556324 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.056036 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.056112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.056497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.556273 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.556347 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:12.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:13.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.056492 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.056825 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:13.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.556340 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.556615 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.055978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.056067 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.555796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.556186 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:15.055759 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.055835 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.056169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:15.056244 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:15.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.556434 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.056196 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.056293 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.056642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.556550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.556891 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.055661 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.056001 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.556096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:17.556152 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:18.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:18.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.055840 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.055916 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.555697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:19.556192 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:20.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:20.555796 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.555870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.556255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.055952 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.056023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.056395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.556064 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.556138 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.556504 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:21.556557 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:22.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.056350 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.056700 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:22.556495 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.556573 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.556915 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.055663 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.055991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.555759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:24.055734 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.055816 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.056168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:24.056239 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:24.556073 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.556516 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:26.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.055868 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.056210 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:26.056286 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:26.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.556109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.055906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.056250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.555667 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.556156 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:28.556210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:29.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:29.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.056614 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.555633 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.555715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:31.055809 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.056307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:31.056368 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:31.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.055756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.555778 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.555851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:33.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.056351 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:33.056404 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:33.556040 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.556451 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.056004 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.056660 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.556087 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.555845 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.556288 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:35.556349 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:36.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:36.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.555888 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:38.055607 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.055689 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.056039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:38.056098 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:38.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.055853 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.056286 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.556210 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.556639 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:40.056445 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.056865 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:40.056930 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:40.556620 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.556695 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.557017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.555789 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.555863 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.556189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.056163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.555934 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.556328 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:42.556374 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:43.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:43.555816 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.556292 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.055998 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.056073 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.556595 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.556924 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:44.556979 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:45.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.056201 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:45.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.056492 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.056876 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.556642 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.556716 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.557036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:46.557089 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:47.055763 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.055839 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:47.555908 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.555986 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.556307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.055720 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.556093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:49.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.056114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:49.056169 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:49.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.055833 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.055910 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.056293 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.555974 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.556043 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:51.056056 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.056128 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.056465 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:51.056513 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:51.556270 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.556344 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.556681 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.056466 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.056539 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.056895 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.555612 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.555693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.556206 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.055909 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.055981 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:53.556175 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:54.055792 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.056260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:54.556080 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.556156 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.556472 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.555651 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.556079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:56.056196 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:56.555837 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.555913 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.056033 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.056356 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.056107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.555780 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.555854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.556190 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:58.556260 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:59.055926 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.056011 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:59.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.556343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.556642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.059082 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.059161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.059514 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.556566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.556913 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:00.556965 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:01.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.055719 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:01.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.555754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:03.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.056098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:03.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:03.555672 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.056115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.556167 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.556265 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.556617 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:05.056442 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.056514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:05.056907 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:05.556597 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.556667 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.556997 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.556092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.055659 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.055728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.056019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:07.556123 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:08.055665 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.055743 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.056061 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:08.555631 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.555705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.055707 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.055787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.555670 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.556065 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:10.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.055794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:10.056268 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:10.555749 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.555819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.556146 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.055855 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:12.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:13.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:13.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.555886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.556252 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.055957 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.056026 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.056385 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.556596 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.556938 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:14.556992 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:15.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:15.555805 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.555887 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.055807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.555813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:17.055842 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.055915 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.056281 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:17.056342 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:17.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.555824 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.555898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.556258 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.055724 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.555983 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.556056 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.556402 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:19.556458 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:20.056180 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.056281 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.056653 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:20.556480 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.556560 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.056634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.056717 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.057043 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:22.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.055734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.056082 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:22.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:22.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.556259 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.055950 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.056030 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.056393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.555649 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.556074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.055763 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.556096 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.556167 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.556536 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:24.556589 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:25.056122 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.056197 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.056567 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:25.556331 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.556402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.556737 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.056615 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.056954 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.555650 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:27.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:27.056160 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:27.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.555829 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.055860 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.055937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.556069 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.556395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:29.056209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:29.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.055732 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.055808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.056154 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.555757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:31.055778 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.055856 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:31.056278 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:31.555669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.555744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.555701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:33.055858 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.056301 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:33.056353 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:33.556038 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.556445 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.056214 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.056311 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.056650 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.556604 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.556677 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.557012 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:35.056645 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.056718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.057052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:35.057102 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:35.555605 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.555680 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.556018 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.055752 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.055826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.056172 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.556126 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.055668 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.056096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.555797 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.555867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.556203 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:37.556272 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:38.055962 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.056083 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.056495 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:38.556271 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.556346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.556695 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.055905 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.055976 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.056296 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.556206 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:39.556792 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:40.056703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.056787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.057218 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:40.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.555750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:42.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.055860 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:42.056301 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:42.555953 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.556025 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.556420 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.555853 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.556039 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.556118 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:44.556541 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:45.055766 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.055855 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:45.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.555793 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:47.055949 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.056052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.056438 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:47.056488 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:47.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.556324 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.556696 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.056448 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.056853 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.556536 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.556604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.556871 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:49.056619 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.056692 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.057011 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:49.057072 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:49.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.556115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.055723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.555726 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.555805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:51.556259 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:52.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.056215 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:52.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.056106 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.555801 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.556202 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:53.556285 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:54.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.056407 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:54.556321 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.556398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.557023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.055680 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.555795 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.555866 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.556181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:56.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.055805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.056166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:56.056245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:56.555703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.555700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.556119 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:58.556172 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:59.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.055903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:59.555918 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.555995 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.056095 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.056184 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.056542 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.556340 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.556426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.556768 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:00.556820 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:01.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.056617 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.056937 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:01.555665 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.056048 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.056120 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.056471 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.556307 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.556375 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:03.056489 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.056566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.056907 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:03.056959 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:03.556548 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.556616 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.556947 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.055614 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.055691 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.056023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.556067 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.556168 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.055755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.056077 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.556113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:05.556171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:06.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.056074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:06.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.555767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.055795 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.555673 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.556083 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:08.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.055846 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.056205 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:08.056297 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:08.555965 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.556379 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.055815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.056111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.556064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:10.556121 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:11.055789 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:11.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.055985 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.555638 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.555713 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.556042 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:13.055626 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.055698 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.056031 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:13.056081 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:13.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.556147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.055908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:14.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.555886 1177669 node_ready.go:38] duration metric: took 6m0.000394955s for node "functional-240845" to be "Ready" ...
	I1218 00:27:14.559015 1177669 out.go:203] 
	W1218 00:27:14.562031 1177669 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:27:14.562056 1177669 out.go:285] * 
	W1218 00:27:14.564187 1177669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:27:14.567133 1177669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:26:34 functional-240845 crio[2318]: time="2025-12-18T00:26:34.986396852Z" level=info msg="Started container" PID=3029 containerID=3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0 description=kube-system/storage-provisioner/storage-provisioner id=92da38d7-d7a2-498b-be1f-6a608d2863bd name=/runtime.v1.RuntimeService/StartContainer sandboxID=552e688f4b2fbc14144d1338b2c4dcb19f6ce8e6a4c97e972802d45e8f302aae
	Dec 18 00:26:35 functional-240845 conmon[3027]: conmon 3051bfe26a7bd174b56e <ninfo>: container 3029 exited with status 1
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.375814642Z" level=info msg="Removing container: d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.383466329Z" level=info msg="Error loading conmon cgroup of container d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b: cgroup deleted" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:35 functional-240845 crio[2318]: time="2025-12-18T00:26:35.386625146Z" level=info msg="Removed container d33eff34065a8845822e47f7213828946feb3b91cb5a51c36cdfef4948a3702b: kube-system/storage-provisioner/storage-provisioner" id=d73da053-cad3-418a-9eeb-f7ed8c9f2276 name=/runtime.v1.RuntimeService/RemoveContainer
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.961463725Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=240ed625-f345-4390-995a-2a00ffe5d533 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.962777924Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=83b0019c-125f-488a-b206-d72df501b471 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.963805181Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=00f1b8b0-efe0-4fa9-a417-a72092c89009 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.964001589Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:26:38 functional-240845 crio[2318]: time="2025-12-18T00:26:38.968079081Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=00f1b8b0-efe0-4fa9-a417-a72092c89009 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.960919479Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=8553d3bc-767f-41e0-83b6-410ac05d4c37 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.961885044Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=81af4700-b21e-478b-99ab-34aa44b35818 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.962802996Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=8bcfd0f5-e059-469e-b064-091f9cedf10c name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.962900684Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:26:49 functional-240845 crio[2318]: time="2025-12-18T00:26:49.967174173Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=8bcfd0f5-e059-469e-b064-091f9cedf10c name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.963021789Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=2b4ec4ab-8335-4e22-a3cd-c579780fa2ca name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.964640183Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=87bae2c1-8fd4-4ec1-9196-cae09ea601e4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.965602499Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=8678d9a2-1c51-4d52-b1a2-f3774e105249 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.965701959Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:00 functional-240845 crio[2318]: time="2025-12-18T00:27:00.96965778Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=8678d9a2-1c51-4d52-b1a2-f3774e105249 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.964208217Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=aa0e3d92-0554-4021-a6d9-7af588b78771 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.966641043Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=c61b476e-8cdc-4380-b5f7-22466ba406b3 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.9686077Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=04ca7006-7f30-4412-ad63-4fd0c4347029 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.96872005Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:12 functional-240845 crio[2318]: time="2025-12-18T00:27:12.972930444Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=04ca7006-7f30-4412-ad63-4fd0c4347029 name=/runtime.v1.RuntimeService/CreateContainer
	
	
	==> container status <==
	CONTAINER           IMAGE                                                              CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	3051bfe26a7bd       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6   44 seconds ago      Exited              storage-provisioner       6                   552e688f4b2fb       storage-provisioner                         kube-system
	56af7390805be       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22   2 minutes ago       Exited              kube-controller-manager   5                   d9cddccbc36e9       kube-controller-manager-functional-240845   kube-system
	3df4b23cd1fc9       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   5 minutes ago       Running             kube-proxy                2                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	9b3fcd7bdcddc       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   5 minutes ago       Running             kindnet-cni               2                   2557f167a47ed       kindnet-84qbm                               kube-system
	fb962917a931f       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   6 minutes ago       Running             etcd                      2                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	f6d062f0f43f4       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   6 minutes ago       Running             kube-scheduler            2                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	45ca9ca01a676       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   6 minutes ago       Running             coredns                   2                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	e79c8e6ec8375       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   7 minutes ago       Exited              kube-proxy                1                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	0fe4c80fa2adf       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   7 minutes ago       Exited              kindnet-cni               1                   2557f167a47ed       kindnet-84qbm                               kube-system
	95d915f37e740       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   7 minutes ago       Exited              etcd                      1                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	9caeb1dccc679       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   7 minutes ago       Exited              kube-scheduler            1                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	cf507cc725a8d       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   7 minutes ago       Exited              coredns                   1                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	2b9f193a1520d       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896   8 minutes ago       Exited              kube-apiserver            0                   e04fd252da213       kube-apiserver-functional-240845            kube-system
	
	
	==> coredns [45ca9ca01a676570a0535560af08d4e95f72145d9702ec8b798ce70d833c0356] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	
	
	==> coredns [cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] 127.0.0.1:42709 - 40478 "HINFO IN 7841480554586397634.8984575394038029725. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043241905s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e] <==
	{"level":"info","ts":"2025-12-18T00:19:42.456949Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.459892Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.464260Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:19:42.464361Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:19:42.473581Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:19:42.473686Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.512923Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.860469Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T00:19:42.860559Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-12-18T00:19:42.860705Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.862987Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.863085Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.863106Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"info","ts":"2025-12-18T00:19:42.863197Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-12-18T00:19:42.863210Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863328Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863350Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863361Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863401Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863409Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863417Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.876941Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-12-18T00:19:42.877021Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.877059Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:19:42.877083Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [fb962917a931fb777a305b1b6998e379972e4d38499641f5d582e94ff93708b1] <==
	{"level":"info","ts":"2025-12-18T00:21:16.766111Z","caller":"fileutil/purge.go:49","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2025-12-18T00:21:16.766332Z","caller":"embed/etcd.go:640","msg":"serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.766371Z","caller":"embed/etcd.go:611","msg":"cmux::serve","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.767189Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1981","msg":"aec36adc501070cc switched to configuration voters=(12593026477526642892)"}
	{"level":"info","ts":"2025-12-18T00:21:16.767293Z","caller":"membership/cluster.go:433","msg":"ignore already added member","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","added-peer-id":"aec36adc501070cc","added-peer-peer-urls":["https://192.168.49.2:2380"],"added-peer-is-learner":false}
	{"level":"info","ts":"2025-12-18T00:21:16.767389Z","caller":"membership/cluster.go:674","msg":"updated cluster version","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","from":"3.6","to":"3.6"}
	{"level":"info","ts":"2025-12-18T00:21:17.452278Z","logger":"raft","caller":"v3@v3.6.0/raft.go:988","msg":"aec36adc501070cc is starting a new election at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452423Z","logger":"raft","caller":"v3@v3.6.0/raft.go:930","msg":"aec36adc501070cc became pre-candidate at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452498Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgPreVoteResp from aec36adc501070cc at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452538Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.452580Z","logger":"raft","caller":"v3@v3.6.0/raft.go:912","msg":"aec36adc501070cc became candidate at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458217Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgVoteResp from aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458311Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.458356Z","logger":"raft","caller":"v3@v3.6.0/raft.go:970","msg":"aec36adc501070cc became leader at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458401Z","logger":"raft","caller":"v3@v3.6.0/node.go:370","msg":"raft.node: aec36adc501070cc elected leader aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.464392Z","caller":"etcdserver/server.go:1820","msg":"published local member to cluster through raft","local-member-id":"aec36adc501070cc","local-member-attributes":"{Name:functional-240845 ClientURLs:[https://192.168.49.2:2379]}","cluster-id":"fa54960ea34d58be","publish-timeout":"7s"}
	{"level":"info","ts":"2025-12-18T00:21:17.464576Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.465463Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.467639Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:21:17.469663Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.484256Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:21:17.484501Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:21:17.485336Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:21:17.490008Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.492585Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	
	
	==> kernel <==
	 00:27:19 up  7:09,  0 user,  load average: 0.09, 0.40, 1.11
	Linux functional-240845 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c] <==
	I1218 00:19:41.706085       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 00:19:41.706608       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1218 00:19:41.706796       1 main.go:148] setting mtu 1500 for CNI 
	I1218 00:19:41.706849       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 00:19:41.706886       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T00:19:41Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	E1218 00:19:41.884497       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	I1218 00:19:41.884891       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 00:19:41.884912       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 00:19:41.884921       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 00:19:41.885210       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1218 00:19:41.885327       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:19:41.885420       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:19:41.885714       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:19:42.810791       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kindnet [9b3fcd7bdcddc7326e7d4c50ecf0ebeef85e8ebe52719009cafb599db42b74a4] <==
	E1218 00:22:39.976452       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:02.096720       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:15.308322       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:23:24.321432       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:23:30.188055       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:40.934269       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:50.555354       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:11.192172       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:24:18.295989       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:20.463220       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:24:45.334437       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:50.163434       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:56.363905       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:14.032706       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:25:16.121579       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:25:27.942524       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:48.335255       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:25:56.102936       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:05.238670       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:26:10.015965       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:26:40.630863       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:43.534436       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:26:50.773423       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:04.308560       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:27:16.542428       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	
	
	==> kube-apiserver [2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de] <==
	W1218 00:19:35.450481       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450516       1 logging.go:55] [core] [Channel #2 SubChannel #6]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450551       1 logging.go:55] [core] [Channel #7 SubChannel #9]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450585       1 logging.go:55] [core] [Channel #26 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450620       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451088       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451140       1 logging.go:55] [core] [Channel #255 SubChannel #257]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451176       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451218       1 logging.go:55] [core] [Channel #187 SubChannel #189]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	I1218 00:19:35.461000       1 controller.go:128] Shutting down kubernetes service endpoint reconciler
	W1218 00:19:35.463755       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463898       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463993       1 logging.go:55] [core] [Channel #235 SubChannel #237]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464077       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464158       1 logging.go:55] [core] [Channel #199 SubChannel #201]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464191       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464387       1 logging.go:55] [core] [Channel #91 SubChannel #93]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464473       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464539       1 logging.go:55] [core] [Channel #223 SubChannel #225]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464601       1 logging.go:55] [core] [Channel #107 SubChannel #109]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464323       1 logging.go:55] [core] [Channel #155 SubChannel #157]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464353       1 logging.go:55] [core] [Channel #99 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464694       1 logging.go:55] [core] [Channel #227 SubChannel #229]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464564       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90] <==
	I1218 00:24:41.593979       1 serving.go:386] Generated self-signed cert in-memory
	I1218 00:24:44.118744       1 controllermanager.go:191] "Starting" version="v1.34.3"
	I1218 00:24:44.118773       1 controllermanager.go:193] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:24:44.120180       1 dynamic_cafile_content.go:161] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1218 00:24:44.120356       1 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1218 00:24:44.120599       1 secure_serving.go:211] Serving securely on 127.0.0.1:10257
	I1218 00:24:44.120986       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E1218 00:24:54.122993       1 controllermanager.go:245] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.49.2:8441/healthz\": dial tcp 192.168.49.2:8441: connect: connection refused"
	
	
	==> kube-proxy [3df4b23cd1fc91cb6876fab74b357bb139f1ea48223b502c7dd9c80ea84c8387] <==
	I1218 00:21:20.402926       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:21:20.489835       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1218 00:21:20.490690       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:21.505281       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:24.355725       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:29.225897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:37.736765       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:52.064415       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:22:27.480420       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:23:19.153551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:24:11.289733       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:05.391565       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:37.545166       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:16.451554       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:57.580830       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	
	
	==> kube-proxy [e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563] <==
	
	
	==> kube-scheduler [9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1] <==
	I1218 00:19:42.975868       1 serving.go:386] Generated self-signed cert in-memory
	W1218 00:19:43.469835       1 authentication.go:397] Error looking up in-cluster authentication configuration: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:19:43.469867       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 00:19:43.469874       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 00:19:43.478778       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 00:19:43.478807       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	E1218 00:19:43.478827       1 event.go:401] "Unable start event watcher (will not retry!)" err="broadcaster already stopped"
	I1218 00:19:43.480968       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481035       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481365       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	E1218 00:19:43.481433       1 server.go:286] "handlers are not fully synchronized" err="context canceled"
	E1218 00:19:43.481498       1 shared_informer.go:352] "Unable to sync caches" logger="UnhandledError" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481515       1 configmap_cafile_content.go:213] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481533       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:19:43.481546       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 00:19:43.481689       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 00:19:43.481706       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 00:19:43.481710       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 00:19:43.481721       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [f6d062f0f43f4922799fb3880d16e341783d4d7d586d7db4a50fb1085ef76e6e] <==
	E1218 00:26:19.142467       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:26:19.248580       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:26:22.795585       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:23.239244       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:26:27.453246       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:28.064817       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:26:28.967293       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:26:33.949400       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:26:35.154591       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:26:40.179897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:26:42.511252       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:26:44.107019       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:26:46.364045       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:26:51.526793       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:26:53.205541       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:53.282506       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:26:56.102986       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: Get \"https://192.168.49.2:8441/api/v1/replicationcontrollers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:26:58.988588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:27:00.599087       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:27:06.721194       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:27:09.382892       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:27:10.196478       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:27:11.331530       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:27:17.829842       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:27:20.022138       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	
	
	==> kubelet <==
	Dec 18 00:27:10 functional-240845 kubelet[1315]: E1218 00:27:10.595150    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:11 functional-240845 kubelet[1315]: I1218 00:27:11.960954    1315 scope.go:117] "RemoveContainer" containerID="3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0"
	Dec 18 00:27:11 functional-240845 kubelet[1315]: E1218 00:27:11.961545    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(36dc300a-a099-40d7-874e-e5c2b3795445)\"" pod="kube-system/storage-provisioner" podUID="36dc300a-a099-40d7-874e-e5c2b3795445"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.055596    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a\\\",\\\"docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1\\\",\\\"docker.io/kinde
st/kindnetd:v20250512-df8de77b\\\"],\\\"sizeBytes\\\":111333938},{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae\\\",\\\"docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3\\\",\\\"docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88\\\"],\\\"sizeBytes\\\":108362109},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\\\",\\\"registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5\\\",\\\"registry.k8s.io/kube-apiserver:v1.34.3\\\"],\\\"sizeBytes\\\":84818927},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86\\\",\\\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\\\",\\\"registry.k8s.io/kube-proxy:v1.34.3\\\"],\\\"sizeBytes\\\":75941783},{\\\"names
\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789\\\",\\\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\\\",\\\"registry.k8s.io/coredns/coredns:v1.12.1\\\"],\\\"sizeBytes\\\":73195387},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1\\\",\\\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\\\",\\\"registry.k8s.io/kube-controller-manager:v1.34.3\\\"],\\\"sizeBytes\\\":72629077},{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\\\",\\\"registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e\\\",\\\"registry.k8s.io/etcd:3.6.5-0\\\"],\\\"sizeBytes\\\":60857170},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf
0ebc01098ef92965cc371eabcb9611\\\",\\\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\\\",\\\"registry.k8s.io/kube-scheduler:v1.34.3\\\"],\\\"sizeBytes\\\":51592021},{\\\"names\\\":[\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2\\\",\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944\\\",\\\"gcr.io/k8s-minikube/storage-provisioner:v5\\\"],\\\"sizeBytes\\\":29037500},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\\\",\\\"registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f\\\",\\\"registry.k8s.io/pause:3.10.1\\\"],\\\"sizeBytes\\\":519884}]}}\" for node \"functional-240845\": Patch \"https://192.168.49.2:8441/api/v1/nodes/functional-240845/status?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056192    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056534    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.056816    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.057023    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.057048    1315 kubelet_node_status.go:473] "Unable to update node status" err="update node status exceeds retry count"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: I1218 00:27:12.961104    1315 scope.go:117] "RemoveContainer" containerID="2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.961521    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kindnet-84qbm\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="046ced09-dec4-43cb-848e-b84560229897" pod="kube-system/kindnet-84qbm"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.966149    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/storage-provisioner\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="36dc300a-a099-40d7-874e-e5c2b3795445" pod="kube-system/storage-provisioner"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.973314    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/etcd-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="9257aaeefd3fa4168607b7fbbc0bc32d" pod="kube-system/etcd-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.973863    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-apiserver-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="deb3e5bf338d69244d476364f7618b54" pod="kube-system/kube-apiserver-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974191    1315 log.go:32] "CreateContainer in sandbox from runtime service failed" err="rpc error: code = Unknown desc = the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" podSandboxID="e04fd252da21318ab96dfa8b10e5404c17e6ae263ccbb9e9f922d43a78607f1a"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974364    1315 kuberuntime_manager.go:1449] "Unhandled Error" err="container kube-apiserver start failed in pod kube-apiserver-functional-240845_kube-system(deb3e5bf338d69244d476364f7618b54): CreateContainerError: the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" logger="UnhandledError"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.974485    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CreateContainerError: \"the container name \\\"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\\\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use\"" pod="kube-system/kube-apiserver-functional-240845" podUID="deb3e5bf338d69244d476364f7618b54"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.975690    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-controller-manager-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="6aa5c667ab761331e5a16029bab33485" pod="kube-system/kube-controller-manager-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976005    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-scheduler-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="8e5e0ee0f3cd0bbcd38493dce832a8ff" pod="kube-system/kube-scheduler-functional-240845"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976667    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-proxy-kr6r5\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="86ad3ff0-4da0-4019-8dc4-c0b794c26b01" pod="kube-system/kube-proxy-kr6r5"
	Dec 18 00:27:12 functional-240845 kubelet[1315]: E1218 00:27:12.976971    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-mrclk\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="39971787-690f-4cc8-814a-be70de00c6a9" pod="kube-system/coredns-66bc5c9577-mrclk"
	Dec 18 00:27:15 functional-240845 kubelet[1315]: I1218 00:27:15.961084    1315 scope.go:117] "RemoveContainer" containerID="56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90"
	Dec 18 00:27:15 functional-240845 kubelet[1315]: E1218 00:27:15.961278    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=kube-controller-manager-functional-240845_kube-system(6aa5c667ab761331e5a16029bab33485)\"" pod="kube-system/kube-controller-manager-functional-240845" podUID="6aa5c667ab761331e5a16029bab33485"
	Dec 18 00:27:17 functional-240845 kubelet[1315]: E1218 00:27:17.258750    1315 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/events/kube-scheduler-functional-240845.18822743ba5c43bb\": dial tcp 192.168.49.2:8441: connect: connection refused" event="&Event{ObjectMeta:{kube-scheduler-functional-240845.18822743ba5c43bb  kube-system    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-scheduler-functional-240845,UID:8e5e0ee0f3cd0bbcd38493dce832a8ff,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://127.0.0.1:10259/readyz\": dial tcp 127.0.0.1:10259: connect: connection refused,Source:EventSource{Component:kubelet,Host:functional-240845,},FirstTimestamp:2025-12-18 00:19:35.725556667 +0000 UTC m=+22.878905798,LastTimestamp:2025-12-18 00:19:36.72607
7626 +0000 UTC m=+23.879426766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:functional-240845,}"
	Dec 18 00:27:17 functional-240845 kubelet[1315]: E1218 00:27:17.596063    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	
	
	==> storage-provisioner [3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0] <==
	I1218 00:26:34.997517       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1218 00:26:34.998962       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845: exit status 2 (361.953589ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-240845" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctional/serial/KubectlGetPods (3.00s)

TestFunctional/serial/MinikubeKubectlCmd (3.01s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 kubectl -- --context functional-240845 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 kubectl -- --context functional-240845 get pods: exit status 1 (118.643368ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-240845 kubectl -- --context functional-240845 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-240845
helpers_test.go:244: (dbg) docker inspect functional-240845:

-- stdout --
	[
	    {
	        "Id": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	        "Created": "2025-12-18T00:18:49.336039923Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1175534,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:18:49.397861382Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hostname",
	        "HostsPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hosts",
	        "LogPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2-json.log",
	        "Name": "/functional-240845",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-240845:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-240845",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	                "LowerDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/merged",
	                "UpperDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/diff",
	                "WorkDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-240845",
	                "Source": "/var/lib/docker/volumes/functional-240845/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-240845",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-240845",
	                "name.minikube.sigs.k8s.io": "functional-240845",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "80ff640c2f3e079a9c83df8e9e88ea18985e04567ee70a1bf3deb87b69d7a9ef",
	            "SandboxKey": "/var/run/docker/netns/80ff640c2f3e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33920"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33921"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33924"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33922"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33923"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-240845": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "f2:33:56:5f:da:77",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3f9ded1bec62ca4e0acc6643285f4a8aef2088de15bf9d1e6dbf478246c82ae7",
	                    "EndpointID": "a267c79a59d712dbf268b4db11b833499096e030f2777b578bf84c7f9519c961",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-240845",
	                        "5d3e3e2a238b"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
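The inspect output above shows how the node container publishes its ports (22, 2376, 5000, 8441, 32443) to loopback host ports. As a minimal sketch (the JSON excerpt is trimmed to two entries taken from the capture above), the same lookup the driver's Go template performs later in this log can be reproduced like this:

```python
import json

# Trimmed excerpt of the "NetworkSettings.Ports" map from the docker
# inspect output above; each container port binds to a host port on 127.0.0.1.
PORTS_JSON = """
{
  "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "33920"}],
  "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "33923"}]
}
"""

def host_port(ports: dict, container_port: str) -> str:
    # Python equivalent of the Go template used by the provisioner below:
    # {{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}
    return ports[container_port][0]["HostPort"]

ports = json.loads(PORTS_JSON)
print(host_port(ports, "22/tcp"))    # -> 33920 (SSH endpoint)
print(host_port(ports, "8441/tcp"))  # -> 33923 (forwarded API server port)
```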
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845: exit status 2 (309.25497ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctional/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs -n 25: (1.642729061s)
helpers_test.go:261: TestFunctional/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                     ARGS                                                      │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ delete  │ -p nospam-499800                                                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:19 UTC │
	│ start   │ -p functional-240845 --alsologtostderr -v=8                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:19 UTC │                     │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:3.1                                                         │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:3.3                                                         │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:latest                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add minikube-local-cache-test:functional-240845                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache delete minikube-local-cache-test:functional-240845                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                              │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ list                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl images                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl rmi registry.k8s.io/pause:latest                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │                     │
	│ cache   │ functional-240845 cache reload                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                              │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                           │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ kubectl │ functional-240845 kubectl -- --context functional-240845 get pods                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:19:34
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
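The header documents the klog-style line format used by every entry below. A minimal parsing sketch (the regex is an assumption covering only this common single-line case):

```python
import re

# Parse the documented format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
LOG_RE = re.compile(
    r"(?P<level>[IWEF])(?P<mmdd>\d{4}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d{6})\s+"
    r"(?P<threadid>\d+) (?P<file>[^:]+):(?P<line>\d+)\] (?P<msg>.*)"
)

sample = "I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ..."
m = LOG_RE.match(sample)
print(m.group("level"), m.group("file"), m.group("line"))  # -> I out.go 360
```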
	I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:19:34.105346 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105377 1177669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:19:34.105397 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105673 1177669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:19:34.106120 1177669 out.go:368] Setting JSON to false
	I1218 00:19:34.107069 1177669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25322,"bootTime":1765991852,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:19:34.107165 1177669 start.go:143] virtualization:  
	I1218 00:19:34.110567 1177669 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:19:34.114275 1177669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:19:34.114378 1177669 notify.go:221] Checking for updates...
	I1218 00:19:34.120029 1177669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:19:34.122925 1177669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:19:34.125751 1177669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:19:34.128638 1177669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:19:34.131461 1177669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:19:34.134887 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:34.134985 1177669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:19:34.159427 1177669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:19:34.159542 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.223972 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.214884618 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.224090 1177669 docker.go:319] overlay module found
	I1218 00:19:34.227220 1177669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:19:34.229963 1177669 start.go:309] selected driver: docker
	I1218 00:19:34.229985 1177669 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.230103 1177669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:19:34.230199 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.285040 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.2764408 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.285449 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:19:34.285507 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:19:34.285561 1177669 start.go:353] cluster config:
	{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.290381 1177669 out.go:179] * Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	I1218 00:19:34.293210 1177669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:19:34.297960 1177669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:19:34.300783 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:19:34.300829 1177669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:19:34.300855 1177669 cache.go:65] Caching tarball of preloaded images
	I1218 00:19:34.300881 1177669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:19:34.300940 1177669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:19:34.300950 1177669 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:19:34.301056 1177669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/config.json ...
	I1218 00:19:34.320164 1177669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:19:34.320186 1177669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:19:34.320203 1177669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:19:34.320279 1177669 start.go:360] acquireMachinesLock for functional-240845: {Name:mk3ed718f4cde9dd7b19ef8d5bcd86c3175b5067 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:19:34.320350 1177669 start.go:364] duration metric: took 45.89µs to acquireMachinesLock for "functional-240845"
	I1218 00:19:34.320375 1177669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:19:34.320383 1177669 fix.go:54] fixHost starting: 
	I1218 00:19:34.320643 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:19:34.337200 1177669 fix.go:112] recreateIfNeeded on functional-240845: state=Running err=<nil>
	W1218 00:19:34.337231 1177669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:19:34.340534 1177669 out.go:252] * Updating the running docker "functional-240845" container ...
	I1218 00:19:34.340583 1177669 machine.go:94] provisionDockerMachine start ...
	I1218 00:19:34.340661 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.357593 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.357953 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.357966 1177669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:19:34.511862 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.511889 1177669 ubuntu.go:182] provisioning hostname "functional-240845"
	I1218 00:19:34.511951 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.530122 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.530421 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.530437 1177669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-240845 && echo "functional-240845" | sudo tee /etc/hostname
	I1218 00:19:34.693713 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.693796 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.711115 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.711437 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.711457 1177669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-240845' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-240845/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-240845' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:19:34.868676 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
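The SSH command above patches /etc/hosts so 127.0.1.1 resolves the node hostname: leave the file alone if the hostname is already present, otherwise rewrite the existing 127.0.1.1 entry or append one. The same decision logic as a Python sketch (the file contents in the example are illustrative, not from this log):

```python
import re

def ensure_hostname(hosts: str, name: str) -> str:
    # Mirror of the grep/sed logic in the SSH command above.
    if re.search(rf"^.*\s{re.escape(name)}$", hosts, flags=re.M):
        return hosts  # hostname already mapped; nothing to do
    if re.search(r"^127\.0\.1\.1\s", hosts, flags=re.M):
        # rewrite the existing 127.0.1.1 line in place
        return re.sub(r"^127\.0\.1\.1\s.*$", f"127.0.1.1 {name}", hosts, flags=re.M)
    return hosts + f"127.0.1.1 {name}\n"  # no entry yet; append one

print(ensure_hostname("127.0.0.1 localhost\n127.0.1.1 oldname\n", "functional-240845"))
```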
	I1218 00:19:34.868704 1177669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:19:34.868727 1177669 ubuntu.go:190] setting up certificates
	I1218 00:19:34.868737 1177669 provision.go:84] configureAuth start
	I1218 00:19:34.868796 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:34.885386 1177669 provision.go:143] copyHostCerts
	I1218 00:19:34.885436 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885473 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:19:34.885484 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885557 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:19:34.885647 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885670 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:19:34.885675 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885701 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:19:34.885784 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885802 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:19:34.885807 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885830 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:19:34.885882 1177669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-240845 san=[127.0.0.1 192.168.49.2 functional-240845 localhost minikube]
	I1218 00:19:35.070465 1177669 provision.go:177] copyRemoteCerts
	I1218 00:19:35.070558 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:19:35.070625 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.089175 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:35.196164 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:19:35.196247 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:19:35.213266 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:19:35.213323 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:19:35.231357 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:19:35.231416 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:19:35.249293 1177669 provision.go:87] duration metric: took 380.542312ms to configureAuth
	I1218 00:19:35.249372 1177669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:19:35.249565 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:35.249673 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.267176 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:35.267503 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:35.267526 1177669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:19:40.661888 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:19:40.661918 1177669 machine.go:97] duration metric: took 6.321326566s to provisionDockerMachine
	I1218 00:19:40.661929 1177669 start.go:293] postStartSetup for "functional-240845" (driver="docker")
	I1218 00:19:40.661947 1177669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:19:40.662006 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:19:40.662069 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.679665 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.787680 1177669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:19:40.790725 1177669 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:19:40.790745 1177669 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:19:40.790750 1177669 command_runner.go:130] > VERSION_ID="12"
	I1218 00:19:40.790757 1177669 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:19:40.790762 1177669 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:19:40.790766 1177669 command_runner.go:130] > ID=debian
	I1218 00:19:40.790771 1177669 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:19:40.790776 1177669 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:19:40.790785 1177669 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:19:40.790821 1177669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:19:40.790843 1177669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:19:40.790853 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:19:40.790906 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:19:40.790988 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:19:40.791003 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:19:40.791081 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:19:40.791089 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:19:40.791141 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:19:40.798177 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:19:40.814786 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:19:40.830892 1177669 start.go:296] duration metric: took 168.948549ms for postStartSetup
	I1218 00:19:40.831030 1177669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:19:40.831082 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.848091 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.952833 1177669 command_runner.go:130] > 13%
	I1218 00:19:40.953354 1177669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:19:40.957853 1177669 command_runner.go:130] > 171G
	I1218 00:19:40.958309 1177669 fix.go:56] duration metric: took 6.637921757s for fixHost
	I1218 00:19:40.958329 1177669 start.go:83] releasing machines lock for "functional-240845", held for 6.637966499s
	I1218 00:19:40.958394 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:40.975843 1177669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:19:40.975911 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.976173 1177669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:19:40.976254 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.995610 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.013560 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.099878 1177669 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:19:41.100025 1177669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:19:41.195326 1177669 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:19:41.198525 1177669 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:19:41.198598 1177669 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:19:41.198697 1177669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:19:41.321255 1177669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:19:41.326138 1177669 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:19:41.326216 1177669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:19:41.326312 1177669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:19:41.337406 1177669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:19:41.337470 1177669 start.go:496] detecting cgroup driver to use...
	I1218 00:19:41.337517 1177669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:19:41.337604 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:19:41.364732 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:19:41.395259 1177669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:19:41.395373 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:19:41.425216 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:19:41.453795 1177669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:19:41.688599 1177669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:19:41.909163 1177669 docker.go:234] disabling docker service ...
	I1218 00:19:41.909312 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:19:41.926883 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:19:41.943387 1177669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:19:42.156451 1177669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:19:42.449825 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:19:42.467750 1177669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:19:42.493864 1177669 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:19:42.495463 1177669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:19:42.495560 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.506971 1177669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:19:42.507118 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.518977 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.530876 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.539925 1177669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:19:42.553447 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.569558 1177669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.582698 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.597525 1177669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:19:42.608606 1177669 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:19:42.609612 1177669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:19:42.617962 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:19:42.846451 1177669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:21:13.130293 1177669 ssh_runner.go:235] Completed: sudo systemctl restart crio: (1m30.283808536s)
	I1218 00:21:13.130318 1177669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:21:13.130368 1177669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:21:13.134416 1177669 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:21:13.134438 1177669 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:21:13.134453 1177669 command_runner.go:130] > Device: 0,72	Inode: 804         Links: 1
	I1218 00:21:13.134460 1177669 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:13.134465 1177669 command_runner.go:130] > Access: 2025-12-18 00:21:13.087402358 +0000
	I1218 00:21:13.134471 1177669 command_runner.go:130] > Modify: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134475 1177669 command_runner.go:130] > Change: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134479 1177669 command_runner.go:130] >  Birth: -
	I1218 00:21:13.134836 1177669 start.go:564] Will wait 60s for crictl version
	I1218 00:21:13.134895 1177669 ssh_runner.go:195] Run: which crictl
	I1218 00:21:13.138647 1177669 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:21:13.138725 1177669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:21:13.167266 1177669 command_runner.go:130] > Version:  0.1.0
	I1218 00:21:13.167284 1177669 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:21:13.167289 1177669 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:21:13.167294 1177669 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:21:13.169251 1177669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:21:13.169347 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.194596 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.194618 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.194624 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.194629 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.194634 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.194639 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.194643 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.194656 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.194660 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.194671 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.194674 1177669 command_runner.go:130] >      static
	I1218 00:21:13.194678 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.194682 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.194686 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.194689 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.194693 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.194697 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.194701 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.194705 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.194709 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.196349 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.221274 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.221297 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.221302 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.221308 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.221313 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.221318 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.221321 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.221326 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.221331 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.221334 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.221338 1177669 command_runner.go:130] >      static
	I1218 00:21:13.221341 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.221345 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.221350 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.221353 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.221357 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.221360 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.221364 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.221369 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.221373 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.226046 1177669 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:21:13.228983 1177669 cli_runner.go:164] Run: docker network inspect functional-240845 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:21:13.244579 1177669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:21:13.248178 1177669 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:21:13.248440 1177669 kubeadm.go:884] updating cluster {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA API
ServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:fal
se DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:21:13.248553 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:21:13.248613 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.282229 1177669 command_runner.go:130] > {
	I1218 00:21:13.282251 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.282256 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282265 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.282269 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282275 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.282279 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282283 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282294 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.282305 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.282308 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282313 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.282332 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282342 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282346 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282350 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282356 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.282364 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282370 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.282373 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282378 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282389 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.282398 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.282403 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282408 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.282415 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282422 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282426 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282434 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282444 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.282449 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282454 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.282462 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282466 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282475 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.282483 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.282491 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282495 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.282499 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282503 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282506 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282509 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282516 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.282523 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282528 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.282532 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282536 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282549 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.282557 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.282564 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282569 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.282578 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.282586 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282589 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282592 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282599 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.282606 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282611 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.282615 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282624 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282631 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.282643 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.282647 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282651 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.282658 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282661 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282665 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282669 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282676 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282680 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282698 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282709 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.282714 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282719 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.282726 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282729 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282737 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.282746 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.282751 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282755 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.282759 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282765 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282769 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282777 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282782 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282785 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282788 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282795 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.282802 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282807 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.282811 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282815 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282828 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.282836 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.282843 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282850 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.282853 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282862 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282865 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282869 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282873 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282883 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282887 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282894 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.282902 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282907 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.282910 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282913 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282922 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.282941 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.282952 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282957 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.282967 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282970 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282973 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282976 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282984 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.282999 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283004 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.283007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283010 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283018 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.283026 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.283029 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283036 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.283040 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283046 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.283054 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283061 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283065 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.283067 1177669 command_runner.go:130] >     },
	I1218 00:21:13.283071 1177669 command_runner.go:130] >     {
	I1218 00:21:13.283079 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.283084 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283089 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.283092 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283099 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283107 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.283116 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.283122 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283126 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.283129 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283133 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.283136 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283144 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283148 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.283155 1177669 command_runner.go:130] >     }
	I1218 00:21:13.283158 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.283161 1177669 command_runner.go:130] > }
	I1218 00:21:13.283336 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.283347 1177669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:21:13.283410 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.307800 1177669 command_runner.go:130] > {
	I1218 00:21:13.307819 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.307823 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307831 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.307836 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307841 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.307845 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307849 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307861 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.307869 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.307872 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307877 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.307881 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307886 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307889 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307893 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307899 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.307903 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307909 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.307912 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307921 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307929 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.307940 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.307943 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307947 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.307951 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307959 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307962 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307965 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307971 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.307975 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307980 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.307983 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307987 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307995 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.308003 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.308007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308011 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.308015 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308020 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308023 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308026 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308032 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.308036 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308042 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.308045 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308049 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308057 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.308065 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.308068 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308072 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.308080 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.308084 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308087 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308090 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308099 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.308103 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308108 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.308111 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308114 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308122 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.308129 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.308132 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308136 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.308140 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308143 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308146 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308149 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308153 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308156 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308159 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308165 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.308168 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308173 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.308176 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308180 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308188 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.308195 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.308198 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308202 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.308206 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308210 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308213 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308217 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308241 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308244 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308247 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308253 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.308262 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308269 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.308275 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308279 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308287 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.308295 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.308298 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308302 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.308306 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308309 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308312 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308316 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308319 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308323 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308325 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308332 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.308335 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308340 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.308343 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308347 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308354 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.308370 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.308374 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308377 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.308381 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308385 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308387 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308390 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308397 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.308400 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308405 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.308408 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308412 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308422 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.308430 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.308433 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308437 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.308440 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308444 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308447 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308450 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308455 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308458 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308461 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308468 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.308472 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308477 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.308480 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308484 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308491 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.308498 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.308501 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308505 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.308508 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308512 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.308515 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308518 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308522 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.308524 1177669 command_runner.go:130] >     }
	I1218 00:21:13.308527 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.308529 1177669 command_runner.go:130] > }
	I1218 00:21:13.310403 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.310424 1177669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:21:13.310432 1177669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.3 crio true true} ...
	I1218 00:21:13.310536 1177669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-240845 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:21:13.310619 1177669 ssh_runner.go:195] Run: crio config
	I1218 00:21:13.358161 1177669 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:21:13.358186 1177669 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:21:13.358194 1177669 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:21:13.358198 1177669 command_runner.go:130] > #
	I1218 00:21:13.358205 1177669 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:21:13.358212 1177669 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:21:13.358218 1177669 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:21:13.358229 1177669 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:21:13.358236 1177669 command_runner.go:130] > # reload'.
	I1218 00:21:13.358243 1177669 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:21:13.358250 1177669 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:21:13.358258 1177669 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:21:13.358264 1177669 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:21:13.358267 1177669 command_runner.go:130] > [crio]
	I1218 00:21:13.358273 1177669 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:21:13.358277 1177669 command_runner.go:130] > # containers images, in this directory.
	I1218 00:21:13.358820 1177669 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:21:13.358837 1177669 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:21:13.359435 1177669 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:21:13.359448 1177669 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:21:13.359935 1177669 command_runner.go:130] > # imagestore = ""
	I1218 00:21:13.359950 1177669 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:21:13.359963 1177669 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:21:13.360646 1177669 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:21:13.360660 1177669 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:21:13.360667 1177669 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:21:13.360961 1177669 command_runner.go:130] > # storage_option = [
	I1218 00:21:13.361308 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.361321 1177669 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:21:13.361334 1177669 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:21:13.361921 1177669 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:21:13.361934 1177669 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:21:13.361949 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:21:13.361954 1177669 command_runner.go:130] > # always happen on a node reboot
	I1218 00:21:13.362559 1177669 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:21:13.362583 1177669 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:21:13.362590 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:21:13.362595 1177669 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:21:13.363052 1177669 command_runner.go:130] > # version_file_persist = ""
	I1218 00:21:13.363067 1177669 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:21:13.363076 1177669 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:21:13.363680 1177669 command_runner.go:130] > # internal_wipe = true
	I1218 00:21:13.363702 1177669 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:21:13.363709 1177669 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:21:13.364349 1177669 command_runner.go:130] > # internal_repair = true
	I1218 00:21:13.364361 1177669 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:21:13.364368 1177669 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:21:13.364377 1177669 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:21:13.364926 1177669 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:21:13.364942 1177669 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:21:13.364946 1177669 command_runner.go:130] > [crio.api]
	I1218 00:21:13.364951 1177669 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:21:13.365581 1177669 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:21:13.365594 1177669 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:21:13.367685 1177669 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:21:13.367700 1177669 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:21:13.367706 1177669 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:21:13.367710 1177669 command_runner.go:130] > # stream_port = "0"
	I1218 00:21:13.367716 1177669 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:21:13.367723 1177669 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:21:13.367730 1177669 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:21:13.367745 1177669 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:21:13.367752 1177669 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:21:13.367762 1177669 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367766 1177669 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:21:13.367773 1177669 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:21:13.367780 1177669 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367784 1177669 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:21:13.367791 1177669 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:21:13.367802 1177669 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:21:13.367808 1177669 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:21:13.367814 1177669 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:21:13.367835 1177669 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367844 1177669 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:21:13.367853 1177669 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367861 1177669 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:21:13.367868 1177669 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:21:13.367879 1177669 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:21:13.367883 1177669 command_runner.go:130] > [crio.runtime]
	I1218 00:21:13.367893 1177669 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:21:13.367904 1177669 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:21:13.367916 1177669 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:21:13.367926 1177669 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:21:13.367934 1177669 command_runner.go:130] > # default_ulimits = [
	I1218 00:21:13.367937 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.367950 1177669 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:21:13.367958 1177669 command_runner.go:130] > # no_pivot = false
	I1218 00:21:13.367963 1177669 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:21:13.367974 1177669 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:21:13.367979 1177669 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:21:13.367988 1177669 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:21:13.367994 1177669 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:21:13.368004 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368012 1177669 command_runner.go:130] > # conmon = ""
	I1218 00:21:13.368015 1177669 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:21:13.368023 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:21:13.368026 1177669 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:21:13.368035 1177669 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:21:13.368044 1177669 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:21:13.368051 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368058 1177669 command_runner.go:130] > # conmon_env = [
	I1218 00:21:13.368061 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368070 1177669 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:21:13.368076 1177669 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:21:13.368084 1177669 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:21:13.368089 1177669 command_runner.go:130] > # default_env = [
	I1218 00:21:13.368092 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368098 1177669 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:21:13.368111 1177669 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:21:13.368119 1177669 command_runner.go:130] > # selinux = false
	I1218 00:21:13.368125 1177669 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:21:13.368136 1177669 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:21:13.368144 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368148 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.368159 1177669 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:21:13.368167 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368171 1177669 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:21:13.368178 1177669 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:21:13.368189 1177669 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:21:13.368199 1177669 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:21:13.368206 1177669 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:21:13.368212 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368217 1177669 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:21:13.368256 1177669 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:21:13.368261 1177669 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:21:13.368266 1177669 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:21:13.368280 1177669 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:21:13.368287 1177669 command_runner.go:130] > # blockio parameters.
	I1218 00:21:13.368292 1177669 command_runner.go:130] > # blockio_reload = false
	I1218 00:21:13.368298 1177669 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:21:13.368303 1177669 command_runner.go:130] > # irqbalance daemon.
	I1218 00:21:13.368311 1177669 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:21:13.368320 1177669 command_runner.go:130] > # irqbalance_config_restore_file allows setting a CPU mask that CRI-O should
	I1218 00:21:13.368327 1177669 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:21:13.368337 1177669 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:21:13.368347 1177669 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:21:13.368357 1177669 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:21:13.368365 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368369 1177669 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:21:13.368375 1177669 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:21:13.368382 1177669 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:21:13.368388 1177669 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:21:13.368396 1177669 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:21:13.368402 1177669 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:21:13.368412 1177669 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:21:13.368419 1177669 command_runner.go:130] > # will be added.
	I1218 00:21:13.368423 1177669 command_runner.go:130] > # default_capabilities = [
	I1218 00:21:13.368430 1177669 command_runner.go:130] > # 	"CHOWN",
	I1218 00:21:13.368434 1177669 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:21:13.368442 1177669 command_runner.go:130] > # 	"FSETID",
	I1218 00:21:13.368445 1177669 command_runner.go:130] > # 	"FOWNER",
	I1218 00:21:13.368457 1177669 command_runner.go:130] > # 	"SETGID",
	I1218 00:21:13.368461 1177669 command_runner.go:130] > # 	"SETUID",
	I1218 00:21:13.368479 1177669 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:21:13.368487 1177669 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:21:13.368490 1177669 command_runner.go:130] > # 	"KILL",
	I1218 00:21:13.368494 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368506 1177669 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:21:13.368515 1177669 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:21:13.368524 1177669 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:21:13.368531 1177669 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:21:13.368539 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368542 1177669 command_runner.go:130] > default_sysctls = [
	I1218 00:21:13.368547 1177669 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:21:13.368554 1177669 command_runner.go:130] > ]
	I1218 00:21:13.368563 1177669 command_runner.go:130] > # List of devices on the host that a
	I1218 00:21:13.368570 1177669 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:21:13.368577 1177669 command_runner.go:130] > # allowed_devices = [
	I1218 00:21:13.368580 1177669 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:21:13.368588 1177669 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:21:13.368594 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368603 1177669 command_runner.go:130] > # List of additional devices, specified as
	I1218 00:21:13.368611 1177669 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:21:13.368618 1177669 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:21:13.368624 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368628 1177669 command_runner.go:130] > # additional_devices = [
	I1218 00:21:13.368633 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368639 1177669 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:21:13.368646 1177669 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:21:13.368649 1177669 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:21:13.368653 1177669 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:21:13.368664 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368673 1177669 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:21:13.368683 1177669 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:21:13.368701 1177669 command_runner.go:130] > # Defaults to false.
	I1218 00:21:13.368712 1177669 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:21:13.368719 1177669 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:21:13.368725 1177669 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:21:13.368734 1177669 command_runner.go:130] > # hooks_dir = [
	I1218 00:21:13.368739 1177669 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:21:13.368745 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368751 1177669 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:21:13.368761 1177669 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:21:13.368770 1177669 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:21:13.368773 1177669 command_runner.go:130] > #
	I1218 00:21:13.368780 1177669 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:21:13.368789 1177669 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:21:13.368795 1177669 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:21:13.368803 1177669 command_runner.go:130] > #
	I1218 00:21:13.368809 1177669 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:21:13.368818 1177669 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:21:13.368829 1177669 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:21:13.368846 1177669 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:21:13.368853 1177669 command_runner.go:130] > #
	I1218 00:21:13.368857 1177669 command_runner.go:130] > # default_mounts_file = ""
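	As a sketch of the /SRC:/DST format described above (hypothetical paths, not part of the logged configuration):

	```
	# Hypothetical /etc/containers/mounts.conf — one mount per line, /SRC:/DST.
	/usr/share/zoneinfo:/usr/share/zoneinfo
	/etc/pki:/etc/pki
	```

	With this override file in place, CRI-O would add only these two mounts and ignore the defaults shipped in /usr/share/containers/mounts.conf.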
	I1218 00:21:13.368866 1177669 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:21:13.368876 1177669 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:21:13.368880 1177669 command_runner.go:130] > # pids_limit = -1
	I1218 00:21:13.368886 1177669 command_runner.go:130] > # Maximum size allowed for the container log file. Negative numbers indicate
	I1218 00:21:13.368894 1177669 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:21:13.368904 1177669 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:21:13.368917 1177669 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:21:13.368923 1177669 command_runner.go:130] > # log_size_max = -1
	I1218 00:21:13.368931 1177669 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:21:13.368938 1177669 command_runner.go:130] > # log_to_journald = false
	I1218 00:21:13.368944 1177669 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:21:13.368949 1177669 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:21:13.368959 1177669 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:21:13.368968 1177669 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:21:13.368974 1177669 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:21:13.368981 1177669 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:21:13.368986 1177669 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:21:13.368993 1177669 command_runner.go:130] > # read_only = false
	I1218 00:21:13.369000 1177669 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:21:13.369009 1177669 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:21:13.369017 1177669 command_runner.go:130] > # live configuration reload.
	I1218 00:21:13.369020 1177669 command_runner.go:130] > # log_level = "info"
	I1218 00:21:13.369026 1177669 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:21:13.369031 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.369036 1177669 command_runner.go:130] > # log_filter = ""
	I1218 00:21:13.369043 1177669 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369052 1177669 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:21:13.369056 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369067 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369074 1177669 command_runner.go:130] > # uid_mappings = ""
	I1218 00:21:13.369084 1177669 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369093 1177669 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:21:13.369097 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369105 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369114 1177669 command_runner.go:130] > # gid_mappings = ""
	I1218 00:21:13.369120 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:21:13.369127 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369139 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369150 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369158 1177669 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:21:13.369165 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:21:13.369174 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369184 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369192 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369196 1177669 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:21:13.369208 1177669 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:21:13.369218 1177669 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:21:13.369224 1177669 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:21:13.369231 1177669 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:21:13.369238 1177669 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:21:13.369247 1177669 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:21:13.369256 1177669 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:21:13.369261 1177669 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:21:13.369265 1177669 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:21:13.369273 1177669 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:21:13.369279 1177669 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:21:13.369286 1177669 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:21:13.369293 1177669 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:21:13.369301 1177669 command_runner.go:130] > # shared_cpuset determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:21:13.369310 1177669 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:21:13.369320 1177669 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:21:13.369326 1177669 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:21:13.369333 1177669 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:21:13.369339 1177669 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:21:13.369347 1177669 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:21:13.369351 1177669 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:21:13.369359 1177669 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:21:13.369363 1177669 command_runner.go:130] > # pinns_path = ""
	I1218 00:21:13.369368 1177669 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:21:13.369378 1177669 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:21:13.369382 1177669 command_runner.go:130] > # enable_criu_support = true
	I1218 00:21:13.369390 1177669 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:21:13.369400 1177669 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:21:13.369407 1177669 command_runner.go:130] > # enable_pod_events = false
	I1218 00:21:13.369414 1177669 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:21:13.369422 1177669 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:21:13.369426 1177669 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:21:13.369431 1177669 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:21:13.369443 1177669 command_runner.go:130] > # will cause container creation to fail (as opposed to the current behavior of creating them as directories).
	I1218 00:21:13.369457 1177669 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:21:13.369465 1177669 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:21:13.369474 1177669 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:21:13.369481 1177669 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:21:13.369486 1177669 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:21:13.369492 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.369499 1177669 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:21:13.369509 1177669 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:21:13.369515 1177669 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:21:13.369521 1177669 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:21:13.369523 1177669 command_runner.go:130] > #
	I1218 00:21:13.369528 1177669 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:21:13.369536 1177669 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:21:13.369540 1177669 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:21:13.369548 1177669 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:21:13.369553 1177669 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:21:13.369561 1177669 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:21:13.369565 1177669 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:21:13.369574 1177669 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:21:13.369577 1177669 command_runner.go:130] > # monitor_env = []
	I1218 00:21:13.369585 1177669 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:21:13.369590 1177669 command_runner.go:130] > # allowed_annotations = []
	I1218 00:21:13.369595 1177669 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:21:13.369599 1177669 command_runner.go:130] > # no_sync_log = false
	I1218 00:21:13.369603 1177669 command_runner.go:130] > # default_annotations = {}
	I1218 00:21:13.369611 1177669 command_runner.go:130] > # stream_websockets = false
	I1218 00:21:13.369614 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.369664 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.369673 1177669 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:21:13.369680 1177669 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:21:13.369686 1177669 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:21:13.369697 1177669 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:21:13.369708 1177669 command_runner.go:130] > #   in $PATH.
	I1218 00:21:13.369718 1177669 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:21:13.369728 1177669 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:21:13.369735 1177669 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:21:13.369741 1177669 command_runner.go:130] > #   state.
	I1218 00:21:13.369747 1177669 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:21:13.369753 1177669 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:21:13.369759 1177669 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:21:13.369765 1177669 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:21:13.369774 1177669 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:21:13.369780 1177669 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:21:13.369789 1177669 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:21:13.369795 1177669 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:21:13.369805 1177669 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:21:13.369813 1177669 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:21:13.369820 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:21:13.369831 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:21:13.370100 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:21:13.370120 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:21:13.370129 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:21:13.370143 1177669 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:21:13.370151 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:21:13.370162 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:21:13.370169 1177669 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:21:13.370176 1177669 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:21:13.370187 1177669 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:21:13.370195 1177669 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:21:13.370206 1177669 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:21:13.370213 1177669 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:21:13.370219 1177669 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:21:13.370232 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:21:13.370239 1177669 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:21:13.370249 1177669 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:21:13.370266 1177669 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:21:13.370271 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:21:13.370283 1177669 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:21:13.370288 1177669 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:21:13.370295 1177669 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:21:13.370305 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:21:13.370313 1177669 command_runner.go:130] > #   When using the pod runtime and conmon-rs, the monitor_env can be used to further configure
	I1218 00:21:13.370317 1177669 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:21:13.370329 1177669 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:21:13.370338 1177669 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:21:13.370350 1177669 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:21:13.370357 1177669 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:21:13.370367 1177669 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:21:13.370375 1177669 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:21:13.370388 1177669 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:21:13.370395 1177669 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:21:13.370408 1177669 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:21:13.370420 1177669 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:21:13.370425 1177669 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:21:13.370437 1177669 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:21:13.370445 1177669 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:21:13.370459 1177669 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:21:13.370466 1177669 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:21:13.370473 1177669 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:21:13.370485 1177669 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:21:13.370488 1177669 command_runner.go:130] > #
	I1218 00:21:13.370493 1177669 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:21:13.370496 1177669 command_runner.go:130] > #
	I1218 00:21:13.370506 1177669 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:21:13.370513 1177669 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:21:13.370516 1177669 command_runner.go:130] > #
	I1218 00:21:13.370525 1177669 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:21:13.370537 1177669 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:21:13.370545 1177669 command_runner.go:130] > #
	I1218 00:21:13.370553 1177669 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:21:13.370556 1177669 command_runner.go:130] > # feature.
	I1218 00:21:13.370563 1177669 command_runner.go:130] > #
	I1218 00:21:13.370569 1177669 command_runner.go:130] > # If everything is setup, CRI-O will modify chosen seccomp profiles for
	I1218 00:21:13.370576 1177669 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:21:13.370587 1177669 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:21:13.370594 1177669 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:21:13.370600 1177669 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:21:13.370610 1177669 command_runner.go:130] > #
	I1218 00:21:13.370618 1177669 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:21:13.370625 1177669 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:21:13.370628 1177669 command_runner.go:130] > #
	I1218 00:21:13.370638 1177669 command_runner.go:130] > # This also means that the Pod's "restartPolicy" has to be set to "Never",
	I1218 00:21:13.370644 1177669 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:21:13.370647 1177669 command_runner.go:130] > #
	I1218 00:21:13.370657 1177669 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:21:13.370664 1177669 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:21:13.370667 1177669 command_runner.go:130] > # limitation.
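	As a sketch of the runtime handler table format and the seccomp notifier setup described above — the handler name "runc-notify" and the paths are illustrative assumptions, not part of the logged configuration:

	```toml
	# Hypothetical crio.conf fragment: a runtime handler that is allowed to
	# process the seccomp notifier annotation.
	[crio.runtime.runtimes.runc-notify]
	runtime_path = "/usr/bin/runc"
	runtime_type = "oci"
	runtime_root = "/run/runc-notify"
	allowed_annotations = [
		"io.kubernetes.cri-o.seccompNotifierAction",
	]
	```

	A pod would then select this handler via its RuntimeClass, set the annotation "io.kubernetes.cri-o.seccompNotifierAction" to "stop", and use restartPolicy "Never", per the notes above.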
	I1218 00:21:13.370672 1177669 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:21:13.370680 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:21:13.370684 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.370688 1177669 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:21:13.370695 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.370699 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371091 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371100 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371106 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371111 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371151 1177669 command_runner.go:130] > allowed_annotations = [
	I1218 00:21:13.371159 1177669 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:21:13.371163 1177669 command_runner.go:130] > ]
	I1218 00:21:13.371167 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371172 1177669 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:21:13.371180 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:21:13.371184 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.371188 1177669 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:21:13.371224 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.371229 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371233 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371242 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371253 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371257 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371263 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371305 1177669 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:21:13.371314 1177669 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:21:13.371321 1177669 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:21:13.371342 1177669 command_runner.go:130] > # Each workload, has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:21:13.371388 1177669 command_runner.go:130] > # The currently supported resources are "cpuperiod" "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:21:13.371402 1177669 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:21:13.371414 1177669 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:21:13.371421 1177669 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:21:13.371470 1177669 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:21:13.371479 1177669 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:21:13.371490 1177669 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:21:13.371528 1177669 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:21:13.371536 1177669 command_runner.go:130] > # Example:
	I1218 00:21:13.371546 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:21:13.371551 1177669 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:21:13.371556 1177669 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:21:13.371561 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:21:13.371569 1177669 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:21:13.371573 1177669 command_runner.go:130] > # cpushares = "5"
	I1218 00:21:13.371606 1177669 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:21:13.371613 1177669 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:21:13.371617 1177669 command_runner.go:130] > # cpulimit = "35"
	I1218 00:21:13.371620 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.371629 1177669 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:21:13.371636 1177669 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:21:13.371647 1177669 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:21:13.371690 1177669 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:21:13.371702 1177669 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:21:13.371713 1177669 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:21:13.371718 1177669 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:21:13.371726 1177669 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:21:13.371777 1177669 command_runner.go:130] > # Default value is set to true
	I1218 00:21:13.371785 1177669 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:21:13.371791 1177669 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:21:13.371796 1177669 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:21:13.371805 1177669 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:21:13.371846 1177669 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:21:13.371855 1177669 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:21:13.371869 1177669 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:21:13.371873 1177669 command_runner.go:130] > # timezone = ""
	I1218 00:21:13.371880 1177669 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:21:13.371883 1177669 command_runner.go:130] > #
	I1218 00:21:13.371923 1177669 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:21:13.371933 1177669 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:21:13.371937 1177669 command_runner.go:130] > [crio.image]
	I1218 00:21:13.371948 1177669 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:21:13.371953 1177669 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:21:13.371960 1177669 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:21:13.372001 1177669 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372008 1177669 command_runner.go:130] > # global_auth_file = ""
	I1218 00:21:13.372014 1177669 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:21:13.372020 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372029 1177669 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.372036 1177669 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:21:13.372043 1177669 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372052 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372057 1177669 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:21:13.372094 1177669 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:21:13.372111 1177669 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:21:13.372119 1177669 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:21:13.372125 1177669 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:21:13.372134 1177669 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:21:13.372140 1177669 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:21:13.372147 1177669 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:21:13.372187 1177669 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:21:13.372197 1177669 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:21:13.372204 1177669 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:21:13.372215 1177669 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:21:13.372270 1177669 command_runner.go:130] > # pinned_images = [
	I1218 00:21:13.372283 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372290 1177669 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:21:13.372301 1177669 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:21:13.372308 1177669 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:21:13.372319 1177669 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:21:13.372324 1177669 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:21:13.372362 1177669 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:21:13.372371 1177669 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:21:13.372384 1177669 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:21:13.372391 1177669 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:21:13.372402 1177669 command_runner.go:130] > # or the concatenated path is non existent, then the signature_policy or system
	I1218 00:21:13.372408 1177669 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:21:13.372414 1177669 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:21:13.372450 1177669 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:21:13.372460 1177669 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:21:13.372464 1177669 command_runner.go:130] > # changing them here.
	I1218 00:21:13.372475 1177669 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:21:13.372479 1177669 command_runner.go:130] > # insecure_registries = [
	I1218 00:21:13.372482 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372489 1177669 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:21:13.372498 1177669 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:21:13.372502 1177669 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:21:13.372541 1177669 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:21:13.372549 1177669 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:21:13.372559 1177669 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:21:13.372567 1177669 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:21:13.372572 1177669 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:21:13.372582 1177669 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:21:13.372591 1177669 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:21:13.372630 1177669 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:21:13.372638 1177669 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:21:13.372643 1177669 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:21:13.372650 1177669 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:21:13.372667 1177669 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used, but the results are ambiguous.
	I1218 00:21:13.372672 1177669 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:21:13.372678 1177669 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:21:13.372721 1177669 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:21:13.372730 1177669 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:21:13.372735 1177669 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:21:13.372746 1177669 command_runner.go:130] > # The crio.network table containers settings pertaining to the management of
	I1218 00:21:13.372750 1177669 command_runner.go:130] > # CNI plugins.
	I1218 00:21:13.372753 1177669 command_runner.go:130] > [crio.network]
	I1218 00:21:13.372759 1177669 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:21:13.372769 1177669 command_runner.go:130] > # CRI-O will pick-up the first one found in network_dir.
	I1218 00:21:13.372773 1177669 command_runner.go:130] > # cni_default_network = ""
	I1218 00:21:13.372780 1177669 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:21:13.372837 1177669 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:21:13.372851 1177669 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:21:13.372856 1177669 command_runner.go:130] > # plugin_dirs = [
	I1218 00:21:13.372860 1177669 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:21:13.372863 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372867 1177669 command_runner.go:130] > # List of included pod metrics.
	I1218 00:21:13.372903 1177669 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:21:13.372909 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372923 1177669 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:21:13.372927 1177669 command_runner.go:130] > [crio.metrics]
	I1218 00:21:13.372933 1177669 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:21:13.372941 1177669 command_runner.go:130] > # enable_metrics = false
	I1218 00:21:13.372946 1177669 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:21:13.372951 1177669 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:21:13.372958 1177669 command_runner.go:130] > # It is possible, to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:21:13.372999 1177669 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:21:13.373006 1177669 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:21:13.373010 1177669 command_runner.go:130] > # metrics_collectors = [
	I1218 00:21:13.373018 1177669 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:21:13.373023 1177669 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:21:13.373033 1177669 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:21:13.373037 1177669 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:21:13.373042 1177669 command_runner.go:130] > # 	"operations_total",
	I1218 00:21:13.373077 1177669 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:21:13.373084 1177669 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:21:13.373089 1177669 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:21:13.373093 1177669 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:21:13.373098 1177669 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:21:13.373106 1177669 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:21:13.373111 1177669 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:21:13.373115 1177669 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:21:13.373120 1177669 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:21:13.373133 1177669 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:21:13.373167 1177669 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:21:13.373176 1177669 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:21:13.373179 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.373190 1177669 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:21:13.373199 1177669 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:21:13.373205 1177669 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:21:13.373209 1177669 command_runner.go:130] > # metrics_port = 9090
	I1218 00:21:13.373214 1177669 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:21:13.373222 1177669 command_runner.go:130] > # metrics_socket = ""
	I1218 00:21:13.373425 1177669 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:21:13.373436 1177669 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:21:13.373448 1177669 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:21:13.373454 1177669 command_runner.go:130] > # certificate on any modification event.
	I1218 00:21:13.373457 1177669 command_runner.go:130] > # metrics_cert = ""
	I1218 00:21:13.373463 1177669 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:21:13.373472 1177669 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:21:13.373475 1177669 command_runner.go:130] > # metrics_key = ""
	I1218 00:21:13.373510 1177669 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:21:13.373518 1177669 command_runner.go:130] > [crio.tracing]
	I1218 00:21:13.373528 1177669 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:21:13.373538 1177669 command_runner.go:130] > # enable_tracing = false
	I1218 00:21:13.373545 1177669 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:21:13.373549 1177669 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:21:13.373560 1177669 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:21:13.373565 1177669 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:21:13.373569 1177669 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:21:13.373602 1177669 command_runner.go:130] > [crio.nri]
	I1218 00:21:13.373606 1177669 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:21:13.373614 1177669 command_runner.go:130] > # enable_nri = true
	I1218 00:21:13.373618 1177669 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:21:13.373623 1177669 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:21:13.373628 1177669 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:21:13.373632 1177669 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:21:13.373641 1177669 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:21:13.373646 1177669 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:21:13.373652 1177669 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:21:13.374323 1177669 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:21:13.374347 1177669 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:21:13.374353 1177669 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:21:13.374359 1177669 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:21:13.374369 1177669 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:21:13.374374 1177669 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:21:13.374384 1177669 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:21:13.374396 1177669 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:21:13.374400 1177669 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:21:13.374404 1177669 command_runner.go:130] > # - OCI hook injection
	I1218 00:21:13.374410 1177669 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:21:13.374419 1177669 command_runner.go:130] > # - adjustment of unconfied seccomp profile
	I1218 00:21:13.374424 1177669 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:21:13.374429 1177669 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:21:13.374440 1177669 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:21:13.374447 1177669 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:21:13.374453 1177669 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:21:13.374461 1177669 command_runner.go:130] > #
	I1218 00:21:13.374470 1177669 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:21:13.374475 1177669 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:21:13.374481 1177669 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:21:13.374487 1177669 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:21:13.374497 1177669 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:21:13.374503 1177669 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:21:13.374508 1177669 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:21:13.374517 1177669 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:21:13.374520 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.374526 1177669 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:21:13.374532 1177669 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:21:13.374540 1177669 command_runner.go:130] > [crio.stats]
	I1218 00:21:13.374546 1177669 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:21:13.374552 1177669 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:21:13.374557 1177669 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:21:13.374567 1177669 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:21:13.374574 1177669 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:21:13.374578 1177669 command_runner.go:130] > # collection_period = 0
	I1218 00:21:13.375235 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337716712Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:21:13.375252 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337755529Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:21:13.375261 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337787676Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:21:13.375269 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337813217Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:21:13.375279 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337887603Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:21:13.375295 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.338323059Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:21:13.375307 1177669 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:21:13.375636 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:21:13.375654 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:21:13.375670 1177669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:21:13.375692 1177669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-240845 NodeName:functional-240845 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:21:13.375818 1177669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-240845"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:21:13.375897 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:21:13.382943 1177669 command_runner.go:130] > kubeadm
	I1218 00:21:13.382987 1177669 command_runner.go:130] > kubectl
	I1218 00:21:13.382992 1177669 command_runner.go:130] > kubelet
	I1218 00:21:13.383228 1177669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:21:13.383323 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:21:13.390563 1177669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (367 bytes)
	I1218 00:21:13.402469 1177669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:21:13.415695 1177669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2214 bytes)
	I1218 00:21:13.427935 1177669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:21:13.431432 1177669 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:21:13.431528 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:13.573724 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:13.587283 1177669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845 for IP: 192.168.49.2
	I1218 00:21:13.587308 1177669 certs.go:195] generating shared ca certs ...
	I1218 00:21:13.587325 1177669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:13.587468 1177669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:21:13.587523 1177669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:21:13.587535 1177669 certs.go:257] generating profile certs ...
	I1218 00:21:13.587627 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key
	I1218 00:21:13.587682 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key.83c30509
	I1218 00:21:13.587749 1177669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key
	I1218 00:21:13.587763 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:21:13.587778 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:21:13.587791 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:21:13.587807 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:21:13.587827 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:21:13.587840 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:21:13.587855 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:21:13.587866 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:21:13.587928 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:21:13.587965 1177669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:21:13.587976 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:21:13.588004 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:21:13.588031 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:21:13.588058 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:21:13.588108 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:21:13.588142 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.588156 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.588167 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.588757 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:21:13.607287 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:21:13.626005 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:21:13.643497 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:21:13.660653 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:21:13.677616 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 00:21:13.694313 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:21:13.711161 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:21:13.728011 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:21:13.745006 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:21:13.761771 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:21:13.778664 1177669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:21:13.791171 1177669 ssh_runner.go:195] Run: openssl version
	I1218 00:21:13.796833 1177669 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:21:13.797285 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.804618 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:21:13.812913 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816610 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816655 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816704 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.857240 1177669 command_runner.go:130] > 51391683
	I1218 00:21:13.857318 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:21:13.864756 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.871981 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:21:13.879459 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883023 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883055 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883126 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.923479 1177669 command_runner.go:130] > 3ec20f2e
	I1218 00:21:13.923967 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:21:13.931505 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.938743 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:21:13.946369 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950234 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950276 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950327 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.990419 1177669 command_runner.go:130] > b5213941
	I1218 00:21:13.990837 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
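The hash-and-symlink steps above follow OpenSSL's certificate-directory convention: each CA in `/etc/ssl/certs` is reachable through a symlink named `<subject-hash>.0`, where the hash comes from `openssl x509 -hash`. A minimal sketch of that install step, using a throwaway self-signed cert and a temp directory instead of the real paths:

```shell
# Create a demo CA cert, then install it the way the log does:
# compute the subject hash and symlink <hash>.0 next to it so
# OpenSSL's directory lookup can find it.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo-ca" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$tmp/ca.pem")   # e.g. an 8-hex-digit value like b5213941
ln -fs "$tmp/ca.pem" "$tmp/$hash.0"
ls -l "$tmp/$hash.0"
```

The `sudo test -L /etc/ssl/certs/<hash>.0` lines in the log are verifying exactly that symlink exists after the `ln -fs`.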
	I1218 00:21:13.998401 1177669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003376 1177669 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003402 1177669 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:21:14.003409 1177669 command_runner.go:130] > Device: 259,1	Inode: 1327743     Links: 1
	I1218 00:21:14.003416 1177669 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:14.003422 1177669 command_runner.go:130] > Access: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003427 1177669 command_runner.go:130] > Modify: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003432 1177669 command_runner.go:130] > Change: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003438 1177669 command_runner.go:130] >  Birth: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003512 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:21:14.045243 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.045691 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:21:14.086658 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.086738 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:21:14.127897 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.128372 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:21:14.168626 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.169131 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:21:14.209194 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.209712 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:21:14.250333 1177669 command_runner.go:130] > Certificate will not expire
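The repeated "Certificate will not expire" responses above come from `openssl x509 -checkend 86400`, which exits 0 (and prints that message) when the certificate is still valid N seconds from now, and exits 1 otherwise. A self-contained sketch with a freshly generated 30-day cert (paths illustrative):

```shell
# -checkend N: exit 0 and print "Certificate will not expire" if the
# cert is still valid N seconds in the future; exit 1 if it would expire.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$tmp/demo.key" -out "$tmp/demo.crt" -days 30 2>/dev/null
openssl x509 -noout -in "$tmp/demo.crt" -checkend 86400
# prints: Certificate will not expire
```

Using 86400 (one day) as the window, as the log does, means certs within a day of expiry are treated as needing regeneration.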
	I1218 00:21:14.250470 1177669 kubeadm.go:401] StartCluster: {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APISer
verNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false
DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:21:14.250558 1177669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:21:14.250623 1177669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:21:14.277777 1177669 command_runner.go:130] > e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563
	I1218 00:21:14.277803 1177669 command_runner.go:130] > 0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c
	I1218 00:21:14.277811 1177669 command_runner.go:130] > 95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e
	I1218 00:21:14.277820 1177669 command_runner.go:130] > 1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5
	I1218 00:21:14.277826 1177669 command_runner.go:130] > 9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1
	I1218 00:21:14.277832 1177669 command_runner.go:130] > 3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad
	I1218 00:21:14.277838 1177669 command_runner.go:130] > cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15
	I1218 00:21:14.277846 1177669 command_runner.go:130] > 38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68
	I1218 00:21:14.277857 1177669 command_runner.go:130] > 1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b
	I1218 00:21:14.277868 1177669 command_runner.go:130] > 61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a
	I1218 00:21:14.277874 1177669 command_runner.go:130] > 98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe
	I1218 00:21:14.277883 1177669 command_runner.go:130] > 891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b
	I1218 00:21:14.277889 1177669 command_runner.go:130] > 2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de
	I1218 00:21:14.277898 1177669 command_runner.go:130] > b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06
	I1218 00:21:14.280281 1177669 cri.go:89] found id: "e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563"
	I1218 00:21:14.280303 1177669 cri.go:89] found id: "0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c"
	I1218 00:21:14.280308 1177669 cri.go:89] found id: "95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e"
	I1218 00:21:14.280312 1177669 cri.go:89] found id: "1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5"
	I1218 00:21:14.280315 1177669 cri.go:89] found id: "9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1"
	I1218 00:21:14.280319 1177669 cri.go:89] found id: "3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad"
	I1218 00:21:14.280323 1177669 cri.go:89] found id: "cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15"
	I1218 00:21:14.280326 1177669 cri.go:89] found id: "38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68"
	I1218 00:21:14.280329 1177669 cri.go:89] found id: "1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b"
	I1218 00:21:14.280337 1177669 cri.go:89] found id: "61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a"
	I1218 00:21:14.280343 1177669 cri.go:89] found id: "98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe"
	I1218 00:21:14.280347 1177669 cri.go:89] found id: "891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b"
	I1218 00:21:14.280355 1177669 cri.go:89] found id: "2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	I1218 00:21:14.280359 1177669 cri.go:89] found id: "b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06"
	I1218 00:21:14.280362 1177669 cri.go:89] found id: ""
	I1218 00:21:14.280415 1177669 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:21:14.291297 1177669 command_runner.go:130] ! time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	W1218 00:21:14.291357 1177669 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	I1218 00:21:14.291439 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:21:14.298396 1177669 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:21:14.298416 1177669 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:21:14.298422 1177669 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:21:14.298426 1177669 command_runner.go:130] > member
	I1218 00:21:14.299333 1177669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:21:14.299377 1177669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:21:14.299453 1177669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:21:14.306750 1177669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:21:14.307329 1177669 kubeconfig.go:125] found "functional-240845" server: "https://192.168.49.2:8441"
	I1218 00:21:14.308688 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.308922 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.310273 1177669 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:21:14.310295 1177669 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:21:14.310301 1177669 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:21:14.310306 1177669 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:21:14.310311 1177669 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:21:14.310598 1177669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:21:14.310964 1177669 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:21:14.321005 1177669 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:21:14.321038 1177669 kubeadm.go:602] duration metric: took 21.641512ms to restartPrimaryControlPlane
	I1218 00:21:14.321068 1177669 kubeadm.go:403] duration metric: took 70.601924ms to StartCluster
	I1218 00:21:14.321095 1177669 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.321175 1177669 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.321832 1177669 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.322054 1177669 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:21:14.322232 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:21:14.322270 1177669 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:21:14.322334 1177669 addons.go:70] Setting storage-provisioner=true in profile "functional-240845"
	I1218 00:21:14.322346 1177669 addons.go:239] Setting addon storage-provisioner=true in "functional-240845"
	W1218 00:21:14.322351 1177669 addons.go:248] addon storage-provisioner should already be in state true
	I1218 00:21:14.322373 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.322797 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.323222 1177669 addons.go:70] Setting default-storageclass=true in profile "functional-240845"
	I1218 00:21:14.323243 1177669 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-240845"
	I1218 00:21:14.323528 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.326222 1177669 out.go:179] * Verifying Kubernetes components...
	I1218 00:21:14.329298 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:14.352407 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.352567 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.353875 1177669 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:21:14.354521 1177669 addons.go:239] Setting addon default-storageclass=true in "functional-240845"
	W1218 00:21:14.354541 1177669 addons.go:248] addon default-storageclass should already be in state true
	I1218 00:21:14.354568 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.355010 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.357054 1177669 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.357084 1177669 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:21:14.357149 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.385892 1177669 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.385914 1177669 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:21:14.385974 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.412313 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.438252 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.538332 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:14.555412 1177669 node_ready.go:35] waiting up to 6m0s for node "functional-240845" to be "Ready" ...
	I1218 00:21:14.556627 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.558515 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:14.558665 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:14.559006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:14.569919 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.635955 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.636102 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.636146 1177669 retry.go:31] will retry after 274.076226ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.646979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.650760 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.650797 1177669 retry.go:31] will retry after 360.821893ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.911221 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.974464 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.974555 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.974595 1177669 retry.go:31] will retry after 225.739861ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.012854 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.055958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.056036 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.056342 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.079682 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.079793 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.079817 1177669 retry.go:31] will retry after 552.403697ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.200970 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.261673 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.261728 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.261746 1177669 retry.go:31] will retry after 669.780864ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.556091 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.632797 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.699577 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.699638 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.699664 1177669 retry.go:31] will retry after 634.295794ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.931763 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.990067 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.993514 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.993545 1177669 retry.go:31] will retry after 1.113615509s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.055858 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:16.334650 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:16.392078 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:16.395777 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.395856 1177669 retry.go:31] will retry after 558.474178ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.556248 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.556629 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:16.556701 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:16.955131 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:17.055832 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.055954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.056319 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:17.076617 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.076722 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.076755 1177669 retry.go:31] will retry after 1.676176244s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.108039 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:17.223472 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.223571 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.223606 1177669 retry.go:31] will retry after 1.165701868s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.556175 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.556607 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.056383 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.056458 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.056745 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.390333 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:18.466841 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.466880 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.466899 1177669 retry.go:31] will retry after 1.475434566s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.556290 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.556363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.556640 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.753095 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:18.817795 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.817871 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.817893 1177669 retry.go:31] will retry after 1.833170296s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:19.056294 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.056363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:19.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:19.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.556536 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.556903 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:19.943440 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:20.003817 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.008032 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.008069 1177669 retry.go:31] will retry after 3.979109659s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.056345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.056668 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.556404 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.556792 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.652153 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:20.711890 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.715639 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.715672 1177669 retry.go:31] will retry after 3.637109781s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:21.056958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.057040 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.057388 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:21.057444 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:21.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.555927 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.556005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.556330 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.056025 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.056094 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.056444 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.556246 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.556345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.556676 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:23.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:23.987349 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:24.051441 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.051487 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.051524 1177669 retry.go:31] will retry after 5.3171516s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.056654 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.056732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.057111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:24.353838 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:24.413422 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.413469 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.413487 1177669 retry.go:31] will retry after 3.340127313s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.555854 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.555928 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:26.056042 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.056124 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.056522 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:26.056585 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:26.556332 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.556411 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.056517 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.056589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.056942 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.555642 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.754507 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:27.812979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:27.813026 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:27.813045 1177669 retry.go:31] will retry after 6.95951013s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:28.056456 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.056550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:28.057006 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:28.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.055872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.368874 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:29.425933 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:29.429391 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.429423 1177669 retry.go:31] will retry after 6.711424265s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.555717 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.055742 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.055823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:30.556179 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:31.055879 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.055958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.056290 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:31.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.556084 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.056199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.555959 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.556028 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.556363 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:32.556413 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:33.055772 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.055844 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.056367 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:33.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.556178 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.055882 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.055955 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.556326 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.556397 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:34.556796 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
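The `node_ready.go:55` warnings repeat because each poll of `GET /api/v1/nodes/functional-240845` is meant to inspect the node's `Ready` condition, and the request never completes. A sketch of that condition check with a hypothetical stand-in struct (the real code uses the Kubernetes client-go types, not these):

```go
package main

import "fmt"

// nodeCondition mirrors the shape of a Kubernetes node condition
// (a type/status pair); an illustrative stand-in, not the client-go type.
type nodeCondition struct {
	Type   string
	Status string
}

// isReady reports whether the "Ready" condition is "True" — the check
// a readiness poller performs on each fetched node object.
func isReady(conds []nodeCondition) bool {
	for _, c := range conds {
		if c.Type == "Ready" {
			return c.Status == "True"
		}
	}
	return false // no Ready condition reported yet
}

func main() {
	conds := []nodeCondition{
		{Type: "MemoryPressure", Status: "False"},
		{Type: "Ready", Status: "True"},
	}
	fmt.Println(isReady(conds)) // true
}
```

In this log the check never gets that far: the GET itself fails, so the poller logs the transport error and retries on its half-second cadence instead of evaluating any conditions.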
	I1218 00:21:34.773144 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:34.829958 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:34.833899 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:34.833929 1177669 retry.go:31] will retry after 8.542321591s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:35.056329 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.056407 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.056770 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:35.556516 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.556605 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.556959 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.057369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.057701 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.141963 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:36.202438 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:36.202477 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.202496 1177669 retry.go:31] will retry after 7.758270018s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:37.055818 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.055893 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:37.056274 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:37.555746 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.555822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.055812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.056254 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.556149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:39.055837 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.056297 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:39.056352 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:39.556245 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.556663 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.056509 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.056606 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.056917 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.555706 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.055770 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.056183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.555731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:41.556201 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:42.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.055854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:42.556028 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.556119 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.556476 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.056276 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.056351 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.056698 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.377156 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:43.435792 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:43.439377 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.439408 1177669 retry.go:31] will retry after 18.255208537s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.556665 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.556738 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.557098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:43.557163 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:43.961544 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:44.047619 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:44.047656 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.047681 1177669 retry.go:31] will retry after 16.124184127s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.055890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.056245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:44.556158 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.556259 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.556606 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:45.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.057068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1218 00:21:45.555737 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.555812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.556144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:46.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:46.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:46.555906 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.556364 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.055801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.555784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.056189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:48.056258 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:48.555948 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.556021 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.556370 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.056069 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.056148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.056513 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.556531 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.556886 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:50.056538 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.056619 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.056964 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:50.057018 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:50.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.556116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.055815 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.055884 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.056198 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.555734 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:52.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:53.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.056005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.056346 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:53.556049 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.056030 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.056102 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.056454 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.556467 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.556542 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.556870 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:54.556927 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:55.055618 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.055704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.056046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:55.555616 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.555993 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.055712 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:57.055721 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:57.056210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:57.555892 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.555965 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.556299 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.555793 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.555877 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.556239 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:59.556292 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:00.055898 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.055985 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.056349 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:00.172859 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:00.349113 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:00.349165 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.349188 1177669 retry.go:31] will retry after 15.178958797s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.556482 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.556554 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.056710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.057020 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.555743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.695619 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:01.764253 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:01.768251 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:01.768286 1177669 retry.go:31] will retry after 20.261734519s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:02.055637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.055714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.056058 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:02.056113 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:02.555751 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.555820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.555659 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.555732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.556022 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:04.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.056080 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:04.056144 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:04.556235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.556662 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.056450 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.056522 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.056859 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.556627 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.556731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.557039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:06.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:06.056174 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:06.555712 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.556120 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.056150 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.555911 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.555990 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.556341 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:08.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.055811 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.056155 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:08.056203 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:08.555901 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.555978 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.556327 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:10.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.055797 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:10.056287 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:10.555954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.556049 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.556390 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.555929 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.056135 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:12.556162 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:13.055804 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.056174 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:13.555873 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.555943 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.056098 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.056468 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.556454 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.556529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.556828 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:14.556880 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:15.056592 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.056660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.057019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.528571 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:15.555957 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.556023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.556331 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.591509 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:15.594869 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:15.594900 1177669 retry.go:31] will retry after 30.932709272s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:16.056512 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.056582 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.056902 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:16.555621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.555718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:17.055743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.056124 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:17.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:17.555739 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.555834 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.055987 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.056365 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.556060 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.556132 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.556480 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:19.056235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.056623 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:19.056697 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:19.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.556660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.556999 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.056621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.056700 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.057051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.555764 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.556195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.055711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:21.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:22.030818 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:22.056348 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.056446 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.056751 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:22.091069 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:22.094766 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.094798 1177669 retry.go:31] will retry after 47.715756714s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.556528 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.556883 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.056081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.555699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.555792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:24.055868 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.055942 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.056302 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:24.056357 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:24.556255 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.556349 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.056263 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.056721 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.556539 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.556623 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.557021 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.055767 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.055851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.555952 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.556027 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:26.556419 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:27.056075 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:27.556423 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.556518 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.056638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.057038 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.555732 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.555814 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.556167 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:29.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:29.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:29.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.055846 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.055931 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.056335 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.556057 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.556129 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.556500 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:31.056282 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.056362 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.056704 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:31.056761 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:31.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.556564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.556896 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.055712 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.556248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.055753 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.556054 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.556161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.556684 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:33.556740 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:34.056460 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.056532 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.056854 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:34.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.556213 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.555889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.556242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:36.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.056089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:36.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:36.555705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.055826 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.056241 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.555756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:38.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.056147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:38.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:38.555776 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.556214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.056109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.555711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.555807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.556166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:40.055954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.056034 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.056481 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:40.056546 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:40.556379 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.556469 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.556814 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.056496 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.056800 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.556572 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.556672 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.557008 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.055744 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.056248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.555835 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.555911 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.556276 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:42.556330 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:43.056000 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.056095 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.056464 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:43.556247 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.556319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.056503 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.056852 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.556012 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.556109 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:44.556512 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:45.057726 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.057809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.058234 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:45.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.556053 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.556425 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.055813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.528809 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:46.556363 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.556430 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.556863 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:46.556912 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:46.592076 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592111 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592213 1177669 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:22:47.056004 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.056462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:47.556264 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.556334 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.556652 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.056410 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.056481 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.056790 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.556515 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.556589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.556921 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:48.556975 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:49.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.055730 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:49.555733 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.555821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.556169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.055866 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.055935 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.555707 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.555815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:51.056519 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.056627 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:51.057002 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:51.555636 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.555709 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.556029 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.055820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.555872 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.555945 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.055695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.055766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.056179 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.555862 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.555934 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:53.556377 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:54.056043 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.056137 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.056494 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:54.556337 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.556432 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.556763 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.056566 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.056640 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.056965 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.555663 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.556084 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:56.056189 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:56.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.556131 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.056558 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.056624 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.056923 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.556638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.556705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.556914 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.055638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.055992 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.556608 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.556858 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:58.556906 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:59.055615 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.055693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.055962 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:59.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.055787 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.055870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.056214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.555725 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:01.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:01.056324 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:01.555994 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.556068 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.556424 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.056333 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.556470 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.556547 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.055624 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.055721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.056047 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:03.556183 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:04.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:04.556084 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.556154 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.556510 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.056318 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.056386 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.056739 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.556502 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.556575 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.556888 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:05.556938 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:06.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.056072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:06.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.555824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.556163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.056617 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.056703 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.057017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.555759 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.556245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:08.055620 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.055732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:08.056120 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:08.555828 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.555904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.055992 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.056064 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.056490 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.556279 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.811086 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:23:09.870262 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873844 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873941 1177669 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:23:09.877215 1177669 out.go:179] * Enabled addons: 
	I1218 00:23:09.880843 1177669 addons.go:530] duration metric: took 1m55.558566134s for enable addons: enabled=[]
	I1218 00:23:10.056212 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.056346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.056713 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:10.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:10.556554 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.556967 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.055656 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.055785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.555809 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.555880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.556212 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.055838 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.556050 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:12.556108 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:13.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.055825 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.056171 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:13.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.556080 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.556462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.055821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.056182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.556315 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.556385 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:14.556741 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:15.056401 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.056470 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.056793 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:15.556411 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.556480 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.556780 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.056647 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.056963 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:17.055850 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.055925 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.056282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:17.056334 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:17.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.556122 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.556497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.056286 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.056361 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.056685 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.556408 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.556802 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:19.056570 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.056646 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.057040 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:19.057095 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:19.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.555860 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.555958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.556317 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.056081 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.056416 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.555830 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.555903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:21.556323 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:22.056015 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.056091 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.056432 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:22.556188 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.556285 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.556619 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.056387 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.056459 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.056805 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.556577 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.556991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:23.557043 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:24.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:24.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.556090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.556429 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.056237 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.056319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.056651 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.556407 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.556484 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.556804 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:26.056601 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.056678 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.057039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:26.057096 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:26.556349 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.556417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.556670 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.056529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.555635 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.555714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.556073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.556018 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.556350 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:28.556398 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:29.056066 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.056141 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.056558 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:29.556410 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.556482 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.556819 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.056192 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.056697 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.556347 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.556425 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.556813 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:30.556882 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:31.056651 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.056724 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.057110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:31.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:33.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:33.056171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:33.556520 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.556626 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.557595 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.055711 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.555832 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.555907 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.556266 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:35.055820 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.056262 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:35.056317 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:35.555980 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.556055 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.556475 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.056253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.056329 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.056689 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.556253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.556322 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.556585 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:37.056343 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.056417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.056777 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:37.056832 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:37.556557 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.556628 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.055768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.055671 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.056076 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.555915 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.555994 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:39.556347 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:40.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.056458 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:40.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.556320 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.556648 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.056467 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.056544 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.556710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:41.557023 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:42.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:42.555713 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.055805 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.055880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.056188 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:44.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.055912 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.056273 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:44.056335 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:44.555920 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.555993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.556368 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.058236 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.058319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.058728 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.556602 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.556934 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.055727 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.056067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.556151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:46.556209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:47.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.056162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:47.555674 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.055810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.556441 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.556514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.556832 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:48.556892 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:49.056604 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.056674 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.057002 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:49.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.555771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.055675 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:51.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:51.056177 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:51.555865 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.555937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.556310 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.555777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:53.055816 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.055886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.056231 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:53.056281 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:53.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.556010 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.556362 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.556026 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.556097 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.556417 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.055678 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.056101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.555734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:55.556129 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:56.055730 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:56.555867 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.555946 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.556300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.056090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.056457 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.556250 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.556323 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.556654 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:57.556712 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:58.056487 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:58.555623 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.055906 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.055982 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.056358 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.555710 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.555803 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:00.059640 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.059720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.060067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:00.060115 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:00.556240 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.556671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.056422 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.056490 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.056823 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.556575 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.556648 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.556984 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.056108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.555781 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:02.556145 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:03.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:03.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.555789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.056105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.556112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:04.556502 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:05.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.056331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.056671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:05.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.556557 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.055645 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.055718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.056059 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.555753 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.556081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:07.056275 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.056343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.056625 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:07.056668 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:07.556435 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.556511 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.055588 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.055660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.056003 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.055807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.055881 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.056246 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.555809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:09.556165 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:10.055865 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.055961 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.056303 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:10.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.556089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.055834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.056300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.556519 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:11.556574 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:12.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.056402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.056727 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:12.556537 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.556620 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.556973 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.055664 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.555826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.556183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:14.055916 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.055993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.056372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:14.056425 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:14.556306 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.556384 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.556722 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.056477 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.056546 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.056868 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.556714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.557060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.556138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:16.556191 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:17.055685 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.056060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:17.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.056016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.555699 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:19.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:19.056164 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:19.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.055623 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.055701 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.056014 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.555727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.055681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:21.556151 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:22.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.056666 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:22.556365 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.556434 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.556767 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.056542 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.056618 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.056908 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.555630 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.556032 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:24.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:24.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:24.556065 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.556148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.556518 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.056084 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.056512 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.556289 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.556691 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:26.056500 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.056600 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.056966 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:26.057019 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:26.555679 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.556107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.055812 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.556038 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.556103 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:28.556166 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:29.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.555778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.555730 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:30.556245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:31.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:31.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.055797 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.555678 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:33.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:33.056107 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:33.555768 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.556187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.055758 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.055833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.556178 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:35.056326 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.056396 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.056747 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:35.056801 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:35.556586 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.556657 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.557044 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.055782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.555908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.556282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:37.056627 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.057073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:37.057140 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:37.555781 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.555850 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.055869 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.055947 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.056284 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.055844 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.055924 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.056291 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:39.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:40.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:40.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.056055 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.555718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.556072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:42.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:42.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:42.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.556016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.556296 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.556369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:44.556718 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:45.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.057363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.057788 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:45.556579 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.556652 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.556974 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.555790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:47.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.055779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.056086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:47.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:47.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.555807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.555758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.556101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:49.556155 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:50.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.056148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:50.555821 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.555892 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.556250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.056032 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.056394 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:51.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:51.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:51.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:51.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:52.055852 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.056308 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:52.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:52.556071 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:52.556442 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.056207 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.056295 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.056620 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:53.556372 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:53.556448 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:53.556772 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:53.556829 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:54.056587 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.056664 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.057045 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:54.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:54.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:54.556382 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.056125 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:55.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:55.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:55.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:56.055825 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.055904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.056298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:56.056356 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:56.556003 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:56.556076 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:56.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.056092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:57.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:57.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:57.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:58.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:58.555906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:58.556260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:58.556314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:59.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.055748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:59.555874 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:59.555954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:59.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.055976 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.056062 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.056441 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:00.556138 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:00.556250 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:00.556688 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:00.556759 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:01.056530 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.056604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.056936 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:01.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:01.555731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:01.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.056275 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:02.555950 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:02.556022 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:02.556393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:03.056088 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.056161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.056527 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:03.056581 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:03.556314 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:03.556377 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:03.556644 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.056325 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.056398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.056741 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:04.556656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:04.556735 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:04.557093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.055643 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:05.556394 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:05.556464 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:05.556836 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:05.556889 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:06.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.057085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:06.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:06.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:06.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.056100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:07.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:07.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:07.556130 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:08.055819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.056255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:08.056314 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:08.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:08.556052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:08.556389 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:09.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:09.556157 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:09.556549 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:10.056352 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.056426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.056786 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:10.056843 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:10.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:10.556658 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:10.557006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.055714 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:11.555890 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:11.555963 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:11.556324 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.056036 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.056112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.056497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:12.556273 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:12.556347 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:12.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:12.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:13.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.056492 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.056825 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:13.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:13.556340 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:13.556615 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.055978 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.056067 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:14.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:14.555796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:14.556186 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:15.055759 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.055835 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.056169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:15.056244 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:15.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:15.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:15.556434 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.056196 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.056293 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.056642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:16.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:16.556550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:16.556891 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.055661 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.056001 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:17.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:17.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:17.556096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:17.556152 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:18.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:18.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:18.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:18.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.055840 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.055916 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:19.555697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:19.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:19.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:19.556192 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:20.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:20.555796 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:20.555870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:20.556255 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.055952 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.056023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.056395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:21.556064 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:21.556138 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:21.556504 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:21.556557 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:22.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.056350 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.056700 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:22.556495 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:22.556573 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:22.556915 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.055591 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.055663 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.055991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:23.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:23.555759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:23.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:24.055734 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.055816 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.056168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:24.056239 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:24.556073 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.556516 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:26.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.055868 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.056210 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:26.056286 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:26.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.556109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.055906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.056250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.555667 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.556156 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:28.556210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:29.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:29.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.056614 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.555633 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.555715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:31.055809 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.056307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:31.056368 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:31.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.055756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.555778 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.555851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:33.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.056351 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:33.056404 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:33.556040 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.556451 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.056004 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.056660 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.556087 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.555845 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.556288 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:35.556349 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:36.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:36.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.555888 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:38.055607 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.055689 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.056039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:38.056098 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:38.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.055853 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.056286 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.556210 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.556639 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:40.056445 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.056865 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:40.056930 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:40.556620 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.556695 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.557017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.555789 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.555863 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.556189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.056163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.555934 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.556328 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:42.556374 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:43.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:43.555816 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.556292 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.055998 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.056073 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.556595 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.556924 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:44.556979 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:45.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.056201 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:45.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.056492 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.056876 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.556642 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.556716 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.557036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:46.557089 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:47.055763 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.055839 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:47.555908 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.555986 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.556307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.055720 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.556093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:49.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.056114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:49.056169 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:49.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.055833 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.055910 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.056293 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.555974 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.556043 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:51.056056 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.056128 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.056465 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:51.056513 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:51.556270 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.556344 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.556681 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.056466 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.056539 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.056895 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.555612 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.555693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.556206 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.055909 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.055981 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:53.556175 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:54.055792 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.056260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:54.556080 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.556156 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.556472 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.555651 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.556079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:56.056196 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:56.555837 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.555913 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.056033 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.056356 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.056107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.555780 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.555854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.556190 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:58.556260 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:59.055926 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.056011 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:59.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.556343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.556642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.059082 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.059161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.059514 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.556566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.556913 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:00.556965 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:01.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.055719 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:01.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.555754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:03.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.056098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:03.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:03.555672 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.056115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.556167 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.556265 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.556617 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:05.056442 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.056514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:05.056907 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:05.556597 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.556667 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.556997 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.556092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.055659 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.055728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.056019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:07.556123 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:08.055665 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.055743 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.056061 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:08.555631 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.555705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.055707 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.055787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.555670 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.556065 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:10.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.055794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:10.056268 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:10.555749 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.555819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.556146 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.055855 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:12.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:13.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:13.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.555886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.556252 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.055957 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.056026 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.056385 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.556596 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.556938 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:14.556992 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:15.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:15.555805 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.555887 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.055807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.555813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:17.055842 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.055915 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.056281 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:17.056342 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:17.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.555824 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.555898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.556258 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.055724 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.555983 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.556056 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.556402 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:19.556458 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:20.056180 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.056281 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.056653 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:20.556480 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.556560 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.056634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.056717 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.057043 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:22.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.055734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.056082 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:22.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:22.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.556259 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.055950 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.056030 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.056393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.555649 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.556074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.055763 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.556096 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.556167 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.556536 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:24.556589 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:25.056122 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.056197 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.056567 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:25.556331 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.556402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.556737 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.056615 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.056954 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.555650 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:27.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:27.056160 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:27.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.555829 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.055860 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.055937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.556069 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.556395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:29.056209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:29.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.055732 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.055808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.056154 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.555757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:31.055778 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.055856 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:31.056278 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:31.555669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.555744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.555701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:33.055858 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.056301 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:33.056353 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:33.556038 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.556445 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.056214 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.056311 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.056650 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.556604 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.556677 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.557012 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:35.056645 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.056718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.057052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:35.057102 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:35.555605 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.555680 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.556018 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.055752 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.055826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.056172 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.556126 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.055668 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.056096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.555797 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.555867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.556203 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:37.556272 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:38.055962 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.056083 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.056495 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:38.556271 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.556346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.556695 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.055905 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.055976 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.056296 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.556206 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:39.556792 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:40.056703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.056787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.057218 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:40.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.555750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:42.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.055860 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:42.056301 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:42.555953 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.556025 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.556420 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.555853 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.556039 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.556118 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:44.556541 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:45.055766 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.055855 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:45.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.555793 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:47.055949 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.056052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.056438 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:47.056488 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:47.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.556324 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.556696 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.056448 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.056853 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.556536 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.556604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.556871 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:49.056619 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.056692 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.057011 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:49.057072 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:49.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.556115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.055723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.555726 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.555805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:51.556259 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:52.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.056215 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:52.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.056106 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.555801 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.556202 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:53.556285 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:54.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.056407 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:54.556321 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.556398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.557023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.055680 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.555795 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.555866 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.556181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:56.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.055805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.056166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:56.056245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:56.555703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.555700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.556119 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:58.556172 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:59.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.055903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:59.555918 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.555995 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.056095 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.056184 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.056542 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.556340 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.556426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.556768 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:00.556820 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:01.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.056617 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.056937 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:01.555665 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.056048 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.056120 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.056471 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.556307 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.556375 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:03.056489 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.056566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.056907 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:03.056959 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:03.556548 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.556616 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.556947 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.055614 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.055691 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.056023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.556067 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.556168 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.055755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.056077 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.556113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:05.556171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:06.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.056074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:06.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.555767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.055795 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.555673 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.556083 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:08.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.055846 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.056205 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:08.056297 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:08.555965 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.556379 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.055815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.056111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.556064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:10.556121 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:11.055789 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:11.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.055985 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.555638 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.555713 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.556042 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:13.055626 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.055698 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.056031 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:13.056081 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:13.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.556147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.055908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:14.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.555886 1177669 node_ready.go:38] duration metric: took 6m0.000394955s for node "functional-240845" to be "Ready" ...
	I1218 00:27:14.559015 1177669 out.go:203] 
	W1218 00:27:14.562031 1177669 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:27:14.562056 1177669 out.go:285] * 
	W1218 00:27:14.564187 1177669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:27:14.567133 1177669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.859940256Z" level=info msg="Checking image status: docker.io/library/minikube-local-cache-test:functional-240845" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.860074193Z" level=info msg="Image docker.io/library/minikube-local-cache-test:functional-240845 not found" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.860115291Z" level=info msg="Neither image nor artfiact docker.io/library/minikube-local-cache-test:functional-240845 found" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883495023Z" level=info msg="Checking image status: localhost/library/minikube-local-cache-test:functional-240845" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883655026Z" level=info msg="Image localhost/library/minikube-local-cache-test:functional-240845 not found" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883717046Z" level=info msg="Neither image nor artfiact localhost/library/minikube-local-cache-test:functional-240845 found" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.846222226Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=dfdf6c19-4ccf-4476-8a2a-d15b86928295 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.962234333Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=0c43511b-deaa-409e-8891-f106b21b680f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.968806121Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=54131ea5-7906-4ff9-b836-aba473d77fa9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.973003314Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=f3a96d9e-d6db-4b6f-9fd4-f7c02579fb06 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.973231483Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.977920901Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=f3a96d9e-d6db-4b6f-9fd4-f7c02579fb06 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198369007Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198493278Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198527311Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.86681905Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.86694314Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.866976583Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890744668Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890901808Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890953064Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917313928Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917480183Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917537453Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:28 functional-240845 crio[2318]: time="2025-12-18T00:27:28.476842553Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=3373db7b-8cec-4c7c-919a-c6e5d09ccb87 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE                                                              CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	3051bfe26a7bd       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6   55 seconds ago      Exited              storage-provisioner       6                   552e688f4b2fb       storage-provisioner                         kube-system
	56af7390805be       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22   2 minutes ago       Exited              kube-controller-manager   5                   d9cddccbc36e9       kube-controller-manager-functional-240845   kube-system
	3df4b23cd1fc9       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   6 minutes ago       Running             kube-proxy                2                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	9b3fcd7bdcddc       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   6 minutes ago       Running             kindnet-cni               2                   2557f167a47ed       kindnet-84qbm                               kube-system
	fb962917a931f       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   6 minutes ago       Running             etcd                      2                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	f6d062f0f43f4       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   6 minutes ago       Running             kube-scheduler            2                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	45ca9ca01a676       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   6 minutes ago       Running             coredns                   2                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	e79c8e6ec8375       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   7 minutes ago       Exited              kube-proxy                1                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	0fe4c80fa2adf       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   7 minutes ago       Exited              kindnet-cni               1                   2557f167a47ed       kindnet-84qbm                               kube-system
	95d915f37e740       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   7 minutes ago       Exited              etcd                      1                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	9caeb1dccc679       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   7 minutes ago       Exited              kube-scheduler            1                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	cf507cc725a8d       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   7 minutes ago       Exited              coredns                   1                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	2b9f193a1520d       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896   8 minutes ago       Exited              kube-apiserver            0                   e04fd252da213       kube-apiserver-functional-240845            kube-system
	
	
	==> coredns [45ca9ca01a676570a0535560af08d4e95f72145d9702ec8b798ce70d833c0356] <==
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> coredns [cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] 127.0.0.1:42709 - 40478 "HINFO IN 7841480554586397634.8984575394038029725. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043241905s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e] <==
	{"level":"info","ts":"2025-12-18T00:19:42.456949Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.459892Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.464260Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:19:42.464361Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:19:42.473581Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:19:42.473686Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.512923Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.860469Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T00:19:42.860559Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-12-18T00:19:42.860705Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.862987Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.863085Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.863106Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"info","ts":"2025-12-18T00:19:42.863197Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-12-18T00:19:42.863210Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863328Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863350Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863361Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863401Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863409Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863417Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.876941Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-12-18T00:19:42.877021Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.877059Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:19:42.877083Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [fb962917a931fb777a305b1b6998e379972e4d38499641f5d582e94ff93708b1] <==
	{"level":"info","ts":"2025-12-18T00:21:16.766111Z","caller":"fileutil/purge.go:49","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2025-12-18T00:21:16.766332Z","caller":"embed/etcd.go:640","msg":"serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.766371Z","caller":"embed/etcd.go:611","msg":"cmux::serve","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.767189Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1981","msg":"aec36adc501070cc switched to configuration voters=(12593026477526642892)"}
	{"level":"info","ts":"2025-12-18T00:21:16.767293Z","caller":"membership/cluster.go:433","msg":"ignore already added member","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","added-peer-id":"aec36adc501070cc","added-peer-peer-urls":["https://192.168.49.2:2380"],"added-peer-is-learner":false}
	{"level":"info","ts":"2025-12-18T00:21:16.767389Z","caller":"membership/cluster.go:674","msg":"updated cluster version","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","from":"3.6","to":"3.6"}
	{"level":"info","ts":"2025-12-18T00:21:17.452278Z","logger":"raft","caller":"v3@v3.6.0/raft.go:988","msg":"aec36adc501070cc is starting a new election at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452423Z","logger":"raft","caller":"v3@v3.6.0/raft.go:930","msg":"aec36adc501070cc became pre-candidate at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452498Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgPreVoteResp from aec36adc501070cc at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452538Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.452580Z","logger":"raft","caller":"v3@v3.6.0/raft.go:912","msg":"aec36adc501070cc became candidate at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458217Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgVoteResp from aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458311Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.458356Z","logger":"raft","caller":"v3@v3.6.0/raft.go:970","msg":"aec36adc501070cc became leader at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458401Z","logger":"raft","caller":"v3@v3.6.0/node.go:370","msg":"raft.node: aec36adc501070cc elected leader aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.464392Z","caller":"etcdserver/server.go:1820","msg":"published local member to cluster through raft","local-member-id":"aec36adc501070cc","local-member-attributes":"{Name:functional-240845 ClientURLs:[https://192.168.49.2:2379]}","cluster-id":"fa54960ea34d58be","publish-timeout":"7s"}
	{"level":"info","ts":"2025-12-18T00:21:17.464576Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.465463Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.467639Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:21:17.469663Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.484256Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:21:17.484501Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:21:17.485336Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:21:17.490008Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.492585Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	
	
	==> kernel <==
	 00:27:30 up  7:09,  0 user,  load average: 0.07, 0.39, 1.10
	Linux functional-240845 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c] <==
	I1218 00:19:41.706085       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 00:19:41.706608       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1218 00:19:41.706796       1 main.go:148] setting mtu 1500 for CNI 
	I1218 00:19:41.706849       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 00:19:41.706886       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T00:19:41Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	E1218 00:19:41.884497       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	I1218 00:19:41.884891       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 00:19:41.884912       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 00:19:41.884921       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 00:19:41.885210       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1218 00:19:41.885327       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:19:41.885420       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:19:41.885714       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:19:42.810791       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kindnet [9b3fcd7bdcddc7326e7d4c50ecf0ebeef85e8ebe52719009cafb599db42b74a4] <==
	E1218 00:23:15.308322       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:23:24.321432       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:23:30.188055       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:40.934269       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:50.555354       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:11.192172       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:24:18.295989       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:20.463220       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:24:45.334437       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:50.163434       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:56.363905       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:14.032706       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:25:16.121579       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:25:27.942524       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:48.335255       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:25:56.102936       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:05.238670       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:26:10.015965       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:26:40.630863       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:43.534436       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:26:50.773423       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:04.308560       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:27:16.542428       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:27:26.936089       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:27.587398       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kube-apiserver [2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de] <==
	W1218 00:19:35.450481       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450516       1 logging.go:55] [core] [Channel #2 SubChannel #6]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450551       1 logging.go:55] [core] [Channel #7 SubChannel #9]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450585       1 logging.go:55] [core] [Channel #26 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450620       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451088       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451140       1 logging.go:55] [core] [Channel #255 SubChannel #257]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451176       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451218       1 logging.go:55] [core] [Channel #187 SubChannel #189]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	I1218 00:19:35.461000       1 controller.go:128] Shutting down kubernetes service endpoint reconciler
	W1218 00:19:35.463755       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463898       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463993       1 logging.go:55] [core] [Channel #235 SubChannel #237]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464077       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464158       1 logging.go:55] [core] [Channel #199 SubChannel #201]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464191       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464387       1 logging.go:55] [core] [Channel #91 SubChannel #93]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464473       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464539       1 logging.go:55] [core] [Channel #223 SubChannel #225]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464601       1 logging.go:55] [core] [Channel #107 SubChannel #109]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464323       1 logging.go:55] [core] [Channel #155 SubChannel #157]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464353       1 logging.go:55] [core] [Channel #99 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464694       1 logging.go:55] [core] [Channel #227 SubChannel #229]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464564       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90] <==
	I1218 00:24:41.593979       1 serving.go:386] Generated self-signed cert in-memory
	I1218 00:24:44.118744       1 controllermanager.go:191] "Starting" version="v1.34.3"
	I1218 00:24:44.118773       1 controllermanager.go:193] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:24:44.120180       1 dynamic_cafile_content.go:161] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1218 00:24:44.120356       1 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1218 00:24:44.120599       1 secure_serving.go:211] Serving securely on 127.0.0.1:10257
	I1218 00:24:44.120986       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E1218 00:24:54.122993       1 controllermanager.go:245] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.49.2:8441/healthz\": dial tcp 192.168.49.2:8441: connect: connection refused"
	
	
	==> kube-proxy [3df4b23cd1fc91cb6876fab74b357bb139f1ea48223b502c7dd9c80ea84c8387] <==
	I1218 00:21:20.402926       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:21:20.489835       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1218 00:21:20.490690       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:21.505281       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:24.355725       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:29.225897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:37.736765       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:52.064415       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:22:27.480420       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:23:19.153551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:24:11.289733       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:05.391565       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:37.545166       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:16.451554       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:57.580830       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:27:28.661385       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	
	
	==> kube-proxy [e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563] <==
	
	
	==> kube-scheduler [9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1] <==
	I1218 00:19:42.975868       1 serving.go:386] Generated self-signed cert in-memory
	W1218 00:19:43.469835       1 authentication.go:397] Error looking up in-cluster authentication configuration: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:19:43.469867       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 00:19:43.469874       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 00:19:43.478778       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 00:19:43.478807       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	E1218 00:19:43.478827       1 event.go:401] "Unable start event watcher (will not retry!)" err="broadcaster already stopped"
	I1218 00:19:43.480968       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481035       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481365       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	E1218 00:19:43.481433       1 server.go:286] "handlers are not fully synchronized" err="context canceled"
	E1218 00:19:43.481498       1 shared_informer.go:352] "Unable to sync caches" logger="UnhandledError" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481515       1 configmap_cafile_content.go:213] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481533       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:19:43.481546       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 00:19:43.481689       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 00:19:43.481706       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 00:19:43.481710       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 00:19:43.481721       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [f6d062f0f43f4922799fb3880d16e341783d4d7d586d7db4a50fb1085ef76e6e] <==
	E1218 00:26:28.967293       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:26:33.949400       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:26:35.154591       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:26:40.179897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:26:42.511252       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:26:44.107019       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:26:46.364045       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:26:51.526793       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:26:53.205541       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:53.282506       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:26:56.102986       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: Get \"https://192.168.49.2:8441/api/v1/replicationcontrollers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:26:58.988588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:27:00.599087       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:27:06.721194       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:27:09.382892       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:27:10.196478       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:27:11.331530       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:27:17.829842       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:27:20.022138       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:27:21.018254       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:27:21.029006       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:27:22.672836       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:27:28.255997       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:27:28.760051       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:27:29.678535       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	
	
	==> kubelet <==
	Dec 18 00:27:17 functional-240845 kubelet[1315]: E1218 00:27:17.596063    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.201701    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a\\\",\\\"docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1\\\",\\\"docker.io/kindest/kindnetd:v20250512-df8de77b\\\"],\\\"sizeBytes\\\":111333938},{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae\\\",\\\"docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3\\\",\\\"docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88\\\"],\\\"sizeBytes\\\":108362109},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\\\",\\\"registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5\\\",\\\"registry.k8s.io/kube-apiserver:v1.34.3\\\"],\\\"sizeBytes\\\":84818927},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86\\\",\\\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\\\",\\\"registry.k8s.io/kube-proxy:v1.34.3\\\"],\\\"sizeBytes\\\":75941783},{\\\"names\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789\\\",\\\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\\\",\\\"registry.k8s.io/coredns/coredns:v1.12.1\\\"],\\\"sizeBytes\\\":73195387},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1\\\",\\\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\\\",\\\"registry.k8s.io/kube-controller-manager:v1.34.3\\\"],\\\"sizeBytes\\\":72629077},{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\\\",\\\"registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e\\\",\\\"registry.k8s.io/etcd:3.6.5-0\\\"],\\\"sizeBytes\\\":60857170},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611\\\",\\\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\\\",\\\"registry.k8s.io/kube-scheduler:v1.34.3\\\"],\\\"sizeBytes\\\":51592021},{\\\"names\\\":[\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2\\\",\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944\\\",\\\"gcr.io/k8s-minikube/storage-provisioner:v5\\\"],\\\"sizeBytes\\\":29037500},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\\\",\\\"registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f\\\",\\\"registry.k8s.io/pause:3.10.1\\\"],\\\"sizeBytes\\\":519884}]}}\" for node \"functional-240845\": Patch \"https://192.168.49.2:8441/api/v1/nodes/functional-240845/status?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.202422    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.202727    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.202997    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.203280    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.203304    1315 kubelet_node_status.go:473] "Unable to update node status" err="update node status exceeds retry count"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: I1218 00:27:22.960675    1315 scope.go:117] "RemoveContainer" containerID="3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.960688    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/storage-provisioner\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="36dc300a-a099-40d7-874e-e5c2b3795445" pod="kube-system/storage-provisioner"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.960831    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(36dc300a-a099-40d7-874e-e5c2b3795445)\"" pod="kube-system/storage-provisioner" podUID="36dc300a-a099-40d7-874e-e5c2b3795445"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.960880    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/etcd-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="9257aaeefd3fa4168607b7fbbc0bc32d" pod="kube-system/etcd-functional-240845"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961026    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-apiserver-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="deb3e5bf338d69244d476364f7618b54" pod="kube-system/kube-apiserver-functional-240845"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961171    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-controller-manager-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="6aa5c667ab761331e5a16029bab33485" pod="kube-system/kube-controller-manager-functional-240845"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961325    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-scheduler-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="8e5e0ee0f3cd0bbcd38493dce832a8ff" pod="kube-system/kube-scheduler-functional-240845"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961551    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-proxy-kr6r5\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="86ad3ff0-4da0-4019-8dc4-c0b794c26b01" pod="kube-system/kube-proxy-kr6r5"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961703    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-mrclk\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="39971787-690f-4cc8-814a-be70de00c6a9" pod="kube-system/coredns-66bc5c9577-mrclk"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961846    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kindnet-84qbm\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="046ced09-dec4-43cb-848e-b84560229897" pod="kube-system/kindnet-84qbm"
	Dec 18 00:27:24 functional-240845 kubelet[1315]: E1218 00:27:24.597369    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: I1218 00:27:26.961555    1315 scope.go:117] "RemoveContainer" containerID="2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978201    1315 log.go:32] "CreateContainer in sandbox from runtime service failed" err="rpc error: code = Unknown desc = the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" podSandboxID="e04fd252da21318ab96dfa8b10e5404c17e6ae263ccbb9e9f922d43a78607f1a"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978295    1315 kuberuntime_manager.go:1449] "Unhandled Error" err="container kube-apiserver start failed in pod kube-apiserver-functional-240845_kube-system(deb3e5bf338d69244d476364f7618b54): CreateContainerError: the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" logger="UnhandledError"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978334    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CreateContainerError: \"the container name \\\"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\\\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use\"" pod="kube-system/kube-apiserver-functional-240845" podUID="deb3e5bf338d69244d476364f7618b54"
	Dec 18 00:27:27 functional-240845 kubelet[1315]: E1218 00:27:27.260012    1315 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/events/kube-scheduler-functional-240845.18822743ba5c43bb\": dial tcp 192.168.49.2:8441: connect: connection refused" event="&Event{ObjectMeta:{kube-scheduler-functional-240845.18822743ba5c43bb  kube-system    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-scheduler-functional-240845,UID:8e5e0ee0f3cd0bbcd38493dce832a8ff,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://127.0.0.1:10259/readyz\": dial tcp 127.0.0.1:10259: connect: connection refused,Source:EventSource{Component:kubelet,Host:functional-240845,},FirstTimestamp:2025-12-18 00:19:35.725556667 +0000 UTC m=+22.878905798,LastTimestamp:2025-12-18 00:19:36.726077626 +0000 UTC m=+23.879426766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:functional-240845,}"
	Dec 18 00:27:29 functional-240845 kubelet[1315]: I1218 00:27:29.960690    1315 scope.go:117] "RemoveContainer" containerID="56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90"
	Dec 18 00:27:29 functional-240845 kubelet[1315]: E1218 00:27:29.960839    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=kube-controller-manager-functional-240845_kube-system(6aa5c667ab761331e5a16029bab33485)\"" pod="kube-system/kube-controller-manager-functional-240845" podUID="6aa5c667ab761331e5a16029bab33485"
	
	
	==> storage-provisioner [3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0] <==
	I1218 00:26:34.997517       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1218 00:26:34.998962       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845: exit status 2 (336.103811ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-240845" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctional/serial/MinikubeKubectlCmd (3.01s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (3.17s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-240845 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-240845 get pods: exit status 1 (110.248467ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-240845 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-240845
helpers_test.go:244: (dbg) docker inspect functional-240845:

-- stdout --
	[
	    {
	        "Id": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	        "Created": "2025-12-18T00:18:49.336039923Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1175534,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:18:49.397861382Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hostname",
	        "HostsPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/hosts",
	        "LogPath": "/var/lib/docker/containers/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2/5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2-json.log",
	        "Name": "/functional-240845",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-240845:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-240845",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "5d3e3e2a238b3981684a05427b97919f702c9e45432ffa0884841a91ad78d3b2",
	                "LowerDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/merged",
	                "UpperDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/diff",
	                "WorkDir": "/var/lib/docker/overlay2/c1cd691f3eadbba936182f90812edff5e18ba857530295e02293110959e1da44/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-240845",
	                "Source": "/var/lib/docker/volumes/functional-240845/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-240845",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-240845",
	                "name.minikube.sigs.k8s.io": "functional-240845",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "80ff640c2f3e079a9c83df8e9e88ea18985e04567ee70a1bf3deb87b69d7a9ef",
	            "SandboxKey": "/var/run/docker/netns/80ff640c2f3e",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33920"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33921"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33924"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33922"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33923"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-240845": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "f2:33:56:5f:da:77",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3f9ded1bec62ca4e0acc6643285f4a8aef2088de15bf9d1e6dbf478246c82ae7",
	                    "EndpointID": "a267c79a59d712dbf268b4db11b833499096e030f2777b578bf84c7f9519c961",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-240845",
	                        "5d3e3e2a238b"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-240845 -n functional-240845: exit status 2 (316.516788ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctional/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctional/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs -n 25: (1.614477434s)
helpers_test.go:261: TestFunctional/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                     ARGS                                                      │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ pause   │ nospam-499800 --log_dir /tmp/nospam-499800 pause                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ unpause │ nospam-499800 --log_dir /tmp/nospam-499800 unpause                                                            │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │                     │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ stop    │ nospam-499800 --log_dir /tmp/nospam-499800 stop                                                               │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ delete  │ -p nospam-499800                                                                                              │ nospam-499800     │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:18 UTC │
	│ start   │ -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:18 UTC │ 18 Dec 25 00:19 UTC │
	│ start   │ -p functional-240845 --alsologtostderr -v=8                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:19 UTC │                     │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:3.1                                                         │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:3.3                                                         │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add registry.k8s.io/pause:latest                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache add minikube-local-cache-test:functional-240845                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ functional-240845 cache delete minikube-local-cache-test:functional-240845                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                              │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ list                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl images                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl rmi registry.k8s.io/pause:latest                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │                     │
	│ cache   │ functional-240845 cache reload                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ ssh     │ functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                              │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                           │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │ 18 Dec 25 00:27 UTC │
	│ kubectl │ functional-240845 kubectl -- --context functional-240845 get pods                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:27 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:19:34
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:19:34.105121 1177669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:19:34.105346 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105377 1177669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:19:34.105397 1177669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:19:34.105673 1177669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:19:34.106120 1177669 out.go:368] Setting JSON to false
	I1218 00:19:34.107069 1177669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25322,"bootTime":1765991852,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:19:34.107165 1177669 start.go:143] virtualization:  
	I1218 00:19:34.110567 1177669 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:19:34.114275 1177669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:19:34.114378 1177669 notify.go:221] Checking for updates...
	I1218 00:19:34.120029 1177669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:19:34.122925 1177669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:19:34.125751 1177669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:19:34.128638 1177669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:19:34.131461 1177669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:19:34.134887 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:34.134985 1177669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:19:34.159427 1177669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:19:34.159542 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.223972 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.214884618 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.224090 1177669 docker.go:319] overlay module found
	I1218 00:19:34.227220 1177669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:19:34.229963 1177669 start.go:309] selected driver: docker
	I1218 00:19:34.229985 1177669 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false D
isableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.230103 1177669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:19:34.230199 1177669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:19:34.285040 1177669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:19:34.2764408 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aar
ch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:19:34.285449 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:19:34.285507 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:19:34.285561 1177669 start.go:353] cluster config:
	{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Containe
rRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetC
lientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:19:34.290381 1177669 out.go:179] * Starting "functional-240845" primary control-plane node in "functional-240845" cluster
	I1218 00:19:34.293210 1177669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:19:34.297960 1177669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:19:34.300783 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:19:34.300829 1177669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:19:34.300855 1177669 cache.go:65] Caching tarball of preloaded images
	I1218 00:19:34.300881 1177669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:19:34.300940 1177669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:19:34.300950 1177669 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 00:19:34.301056 1177669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/config.json ...
	I1218 00:19:34.320164 1177669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:19:34.320186 1177669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:19:34.320203 1177669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:19:34.320279 1177669 start.go:360] acquireMachinesLock for functional-240845: {Name:mk3ed718f4cde9dd7b19ef8d5bcd86c3175b5067 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:19:34.320350 1177669 start.go:364] duration metric: took 45.89µs to acquireMachinesLock for "functional-240845"
	I1218 00:19:34.320375 1177669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:19:34.320383 1177669 fix.go:54] fixHost starting: 
	I1218 00:19:34.320643 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:19:34.337200 1177669 fix.go:112] recreateIfNeeded on functional-240845: state=Running err=<nil>
	W1218 00:19:34.337231 1177669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:19:34.340534 1177669 out.go:252] * Updating the running docker "functional-240845" container ...
	I1218 00:19:34.340583 1177669 machine.go:94] provisionDockerMachine start ...
	I1218 00:19:34.340661 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.357593 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.357953 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.357966 1177669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:19:34.511862 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.511889 1177669 ubuntu.go:182] provisioning hostname "functional-240845"
	I1218 00:19:34.511951 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.530122 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.530421 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.530437 1177669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-240845 && echo "functional-240845" | sudo tee /etc/hostname
	I1218 00:19:34.693713 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-240845
	
	I1218 00:19:34.693796 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:34.711115 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:34.711437 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:34.711457 1177669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-240845' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-240845/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-240845' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:19:34.868676 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:19:34.868704 1177669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:19:34.868727 1177669 ubuntu.go:190] setting up certificates
	I1218 00:19:34.868737 1177669 provision.go:84] configureAuth start
	I1218 00:19:34.868796 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:34.885386 1177669 provision.go:143] copyHostCerts
	I1218 00:19:34.885436 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885473 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:19:34.885484 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:19:34.885557 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:19:34.885647 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885670 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:19:34.885675 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:19:34.885701 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:19:34.885784 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885802 1177669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:19:34.885807 1177669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:19:34.885830 1177669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:19:34.885882 1177669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-240845 san=[127.0.0.1 192.168.49.2 functional-240845 localhost minikube]
	I1218 00:19:35.070465 1177669 provision.go:177] copyRemoteCerts
	I1218 00:19:35.070558 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:19:35.070625 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.089175 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:35.196164 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:19:35.196247 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:19:35.213266 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:19:35.213323 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:19:35.231357 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:19:35.231416 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:19:35.249293 1177669 provision.go:87] duration metric: took 380.542312ms to configureAuth
	I1218 00:19:35.249372 1177669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:19:35.249565 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:19:35.249673 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:35.267176 1177669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:19:35.267503 1177669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33920 <nil> <nil>}
	I1218 00:19:35.267526 1177669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:19:40.661888 1177669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:19:40.661918 1177669 machine.go:97] duration metric: took 6.321326566s to provisionDockerMachine
	I1218 00:19:40.661929 1177669 start.go:293] postStartSetup for "functional-240845" (driver="docker")
	I1218 00:19:40.661947 1177669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:19:40.662006 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:19:40.662069 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.679665 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.787680 1177669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:19:40.790725 1177669 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:19:40.790745 1177669 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:19:40.790750 1177669 command_runner.go:130] > VERSION_ID="12"
	I1218 00:19:40.790757 1177669 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:19:40.790762 1177669 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:19:40.790766 1177669 command_runner.go:130] > ID=debian
	I1218 00:19:40.790771 1177669 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:19:40.790776 1177669 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:19:40.790785 1177669 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:19:40.790821 1177669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:19:40.790843 1177669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:19:40.790853 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:19:40.790906 1177669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:19:40.790988 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:19:40.791003 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:19:40.791081 1177669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:19:40.791089 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:19:40.791141 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:19:40.798177 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:19:40.814786 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:19:40.830892 1177669 start.go:296] duration metric: took 168.948549ms for postStartSetup
	I1218 00:19:40.831030 1177669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:19:40.831082 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.848091 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:40.952833 1177669 command_runner.go:130] > 13%
	I1218 00:19:40.953354 1177669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:19:40.957853 1177669 command_runner.go:130] > 171G
	I1218 00:19:40.958309 1177669 fix.go:56] duration metric: took 6.637921757s for fixHost
	I1218 00:19:40.958329 1177669 start.go:83] releasing machines lock for "functional-240845", held for 6.637966499s
	I1218 00:19:40.958394 1177669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-240845
	I1218 00:19:40.975843 1177669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:19:40.975911 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.976173 1177669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:19:40.976254 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:19:40.995610 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.013560 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:19:41.099878 1177669 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:19:41.100025 1177669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:19:41.195326 1177669 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:19:41.198525 1177669 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:19:41.198598 1177669 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:19:41.198697 1177669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:19:41.321255 1177669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:19:41.326138 1177669 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:19:41.326216 1177669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:19:41.326312 1177669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:19:41.337406 1177669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:19:41.337470 1177669 start.go:496] detecting cgroup driver to use...
	I1218 00:19:41.337517 1177669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:19:41.337604 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:19:41.364732 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:19:41.395259 1177669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:19:41.395373 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:19:41.425216 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:19:41.453795 1177669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:19:41.688599 1177669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:19:41.909163 1177669 docker.go:234] disabling docker service ...
	I1218 00:19:41.909312 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:19:41.926883 1177669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:19:41.943387 1177669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:19:42.156451 1177669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:19:42.449825 1177669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:19:42.467750 1177669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:19:42.493864 1177669 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:19:42.495463 1177669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:19:42.495560 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.506971 1177669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:19:42.507118 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.518977 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.530876 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.539925 1177669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:19:42.553447 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.569558 1177669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.582698 1177669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:19:42.597525 1177669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:19:42.608606 1177669 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:19:42.609612 1177669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:19:42.617962 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:19:42.846451 1177669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:21:13.130293 1177669 ssh_runner.go:235] Completed: sudo systemctl restart crio: (1m30.283808536s)
	I1218 00:21:13.130318 1177669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:21:13.130368 1177669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:21:13.134416 1177669 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:21:13.134438 1177669 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:21:13.134453 1177669 command_runner.go:130] > Device: 0,72	Inode: 804         Links: 1
	I1218 00:21:13.134460 1177669 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:13.134465 1177669 command_runner.go:130] > Access: 2025-12-18 00:21:13.087402358 +0000
	I1218 00:21:13.134471 1177669 command_runner.go:130] > Modify: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134475 1177669 command_runner.go:130] > Change: 2025-12-18 00:21:12.995405346 +0000
	I1218 00:21:13.134479 1177669 command_runner.go:130] >  Birth: -
	I1218 00:21:13.134836 1177669 start.go:564] Will wait 60s for crictl version
	I1218 00:21:13.134895 1177669 ssh_runner.go:195] Run: which crictl
	I1218 00:21:13.138647 1177669 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:21:13.138725 1177669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:21:13.167266 1177669 command_runner.go:130] > Version:  0.1.0
	I1218 00:21:13.167284 1177669 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:21:13.167289 1177669 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:21:13.167294 1177669 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:21:13.169251 1177669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:21:13.169347 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.194596 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.194618 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.194624 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.194629 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.194634 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.194639 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.194643 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.194656 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.194660 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.194671 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.194674 1177669 command_runner.go:130] >      static
	I1218 00:21:13.194678 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.194682 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.194686 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.194689 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.194693 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.194697 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.194701 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.194705 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.194709 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.196349 1177669 ssh_runner.go:195] Run: crio --version
	I1218 00:21:13.221274 1177669 command_runner.go:130] > crio version 1.34.3
	I1218 00:21:13.221297 1177669 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:21:13.221302 1177669 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:21:13.221308 1177669 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:21:13.221313 1177669 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:21:13.221318 1177669 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:21:13.221321 1177669 command_runner.go:130] >    Compiler:       gc
	I1218 00:21:13.221326 1177669 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:21:13.221331 1177669 command_runner.go:130] >    Linkmode:       static
	I1218 00:21:13.221334 1177669 command_runner.go:130] >    BuildTags:
	I1218 00:21:13.221338 1177669 command_runner.go:130] >      static
	I1218 00:21:13.221341 1177669 command_runner.go:130] >      netgo
	I1218 00:21:13.221345 1177669 command_runner.go:130] >      osusergo
	I1218 00:21:13.221350 1177669 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:21:13.221353 1177669 command_runner.go:130] >      seccomp
	I1218 00:21:13.221357 1177669 command_runner.go:130] >      apparmor
	I1218 00:21:13.221360 1177669 command_runner.go:130] >      selinux
	I1218 00:21:13.221364 1177669 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:21:13.221369 1177669 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:21:13.221373 1177669 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:21:13.226046 1177669 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 00:21:13.228983 1177669 cli_runner.go:164] Run: docker network inspect functional-240845 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:21:13.244579 1177669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:21:13.248178 1177669 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:21:13.248440 1177669 kubeadm.go:884] updating cluster {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:21:13.248553 1177669 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:21:13.248613 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.282229 1177669 command_runner.go:130] > {
	I1218 00:21:13.282251 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.282256 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282265 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.282269 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282275 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.282279 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282283 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282294 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.282305 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.282308 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282313 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.282332 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282342 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282346 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282350 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282356 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.282364 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282370 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.282373 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282378 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282389 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.282398 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.282403 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282408 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.282415 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282422 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282426 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282434 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282444 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.282449 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282454 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.282462 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282466 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282475 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.282483 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.282491 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282495 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.282499 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282503 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282506 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282509 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282516 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.282523 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282528 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.282532 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282536 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282549 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.282557 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.282564 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282569 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.282578 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.282586 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282589 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282592 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282599 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.282606 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282611 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.282615 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282624 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282631 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.282643 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.282647 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282651 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.282658 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282661 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282665 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282669 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282676 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282680 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282698 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282709 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.282714 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282719 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.282726 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282729 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282737 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.282746 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.282751 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282755 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.282759 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282765 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282769 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282777 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282782 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282785 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282788 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282795 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.282802 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282807 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.282811 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282815 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282828 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.282836 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.282843 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282850 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.282853 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.282862 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.282865 1177669 command_runner.go:130] >       },
	I1218 00:21:13.282869 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282873 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282883 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282887 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282894 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.282902 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.282907 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.282910 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282913 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.282922 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.282941 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.282952 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.282957 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.282967 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.282970 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.282973 1177669 command_runner.go:130] >     },
	I1218 00:21:13.282976 1177669 command_runner.go:130] >     {
	I1218 00:21:13.282984 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.282999 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283004 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.283007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283010 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283018 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.283026 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.283029 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283036 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.283040 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283046 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.283054 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283061 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283065 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.283067 1177669 command_runner.go:130] >     },
	I1218 00:21:13.283071 1177669 command_runner.go:130] >     {
	I1218 00:21:13.283079 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.283084 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.283089 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.283092 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283099 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.283107 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.283116 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.283122 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.283126 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.283129 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.283133 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.283136 1177669 command_runner.go:130] >       },
	I1218 00:21:13.283144 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.283148 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.283155 1177669 command_runner.go:130] >     }
	I1218 00:21:13.283158 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.283161 1177669 command_runner.go:130] > }
	I1218 00:21:13.283336 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.283347 1177669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:21:13.283410 1177669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:21:13.307800 1177669 command_runner.go:130] > {
	I1218 00:21:13.307819 1177669 command_runner.go:130] >   "images":  [
	I1218 00:21:13.307823 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307831 1177669 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:21:13.307836 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307841 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:21:13.307845 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307849 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307861 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:21:13.307869 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:21:13.307872 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307877 1177669 command_runner.go:130] >       "size":  "111333938",
	I1218 00:21:13.307881 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307886 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307889 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307893 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307899 1177669 command_runner.go:130] >       "id":  "c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13",
	I1218 00:21:13.307903 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307909 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"
	I1218 00:21:13.307912 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307921 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307929 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae",
	I1218 00:21:13.307940 1177669 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"
	I1218 00:21:13.307943 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307947 1177669 command_runner.go:130] >       "size":  "108362109",
	I1218 00:21:13.307951 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.307959 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.307962 1177669 command_runner.go:130] >     },
	I1218 00:21:13.307965 1177669 command_runner.go:130] >     {
	I1218 00:21:13.307971 1177669 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:21:13.307975 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.307980 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:21:13.307983 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.307987 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.307995 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:21:13.308003 1177669 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:21:13.308007 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308011 1177669 command_runner.go:130] >       "size":  "29037500",
	I1218 00:21:13.308015 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308020 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308023 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308026 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308032 1177669 command_runner.go:130] >       "id":  "138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc",
	I1218 00:21:13.308036 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308042 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.12.1"
	I1218 00:21:13.308045 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308049 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308057 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789",
	I1218 00:21:13.308065 1177669 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"
	I1218 00:21:13.308068 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308072 1177669 command_runner.go:130] >       "size":  "73195387",
	I1218 00:21:13.308080 1177669 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:21:13.308084 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308087 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308090 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308099 1177669 command_runner.go:130] >       "id":  "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1218 00:21:13.308103 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308108 1177669 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1218 00:21:13.308111 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308114 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308122 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534",
	I1218 00:21:13.308129 1177669 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"
	I1218 00:21:13.308132 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308136 1177669 command_runner.go:130] >       "size":  "60857170",
	I1218 00:21:13.308140 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308143 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308146 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308149 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308153 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308156 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308159 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308165 1177669 command_runner.go:130] >       "id":  "cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896",
	I1218 00:21:13.308168 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308173 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.34.3"
	I1218 00:21:13.308176 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308180 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308188 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460",
	I1218 00:21:13.308195 1177669 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"
	I1218 00:21:13.308198 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308202 1177669 command_runner.go:130] >       "size":  "84818927",
	I1218 00:21:13.308206 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308210 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308213 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308217 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308241 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308244 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308247 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308253 1177669 command_runner.go:130] >       "id":  "7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22",
	I1218 00:21:13.308262 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308269 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.34.3"
	I1218 00:21:13.308275 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308279 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308287 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1",
	I1218 00:21:13.308295 1177669 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"
	I1218 00:21:13.308298 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308302 1177669 command_runner.go:130] >       "size":  "72629077",
	I1218 00:21:13.308306 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308309 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308312 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308316 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308319 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308323 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308325 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308332 1177669 command_runner.go:130] >       "id":  "4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162",
	I1218 00:21:13.308335 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308340 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.34.3"
	I1218 00:21:13.308343 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308347 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308354 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86",
	I1218 00:21:13.308370 1177669 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"
	I1218 00:21:13.308374 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308377 1177669 command_runner.go:130] >       "size":  "75941783",
	I1218 00:21:13.308381 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308385 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308387 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308390 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308397 1177669 command_runner.go:130] >       "id":  "2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6",
	I1218 00:21:13.308400 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308405 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.34.3"
	I1218 00:21:13.308408 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308412 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308422 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611",
	I1218 00:21:13.308430 1177669 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"
	I1218 00:21:13.308433 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308437 1177669 command_runner.go:130] >       "size":  "51592021",
	I1218 00:21:13.308440 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308444 1177669 command_runner.go:130] >         "value":  "0"
	I1218 00:21:13.308447 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308450 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308455 1177669 command_runner.go:130] >       "pinned":  false
	I1218 00:21:13.308458 1177669 command_runner.go:130] >     },
	I1218 00:21:13.308461 1177669 command_runner.go:130] >     {
	I1218 00:21:13.308468 1177669 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:21:13.308472 1177669 command_runner.go:130] >       "repoTags":  [
	I1218 00:21:13.308477 1177669 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.308480 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308484 1177669 command_runner.go:130] >       "repoDigests":  [
	I1218 00:21:13.308491 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:21:13.308498 1177669 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:21:13.308501 1177669 command_runner.go:130] >       ],
	I1218 00:21:13.308505 1177669 command_runner.go:130] >       "size":  "519884",
	I1218 00:21:13.308508 1177669 command_runner.go:130] >       "uid":  {
	I1218 00:21:13.308512 1177669 command_runner.go:130] >         "value":  "65535"
	I1218 00:21:13.308515 1177669 command_runner.go:130] >       },
	I1218 00:21:13.308518 1177669 command_runner.go:130] >       "username":  "",
	I1218 00:21:13.308522 1177669 command_runner.go:130] >       "pinned":  true
	I1218 00:21:13.308524 1177669 command_runner.go:130] >     }
	I1218 00:21:13.308527 1177669 command_runner.go:130] >   ]
	I1218 00:21:13.308529 1177669 command_runner.go:130] > }
	I1218 00:21:13.310403 1177669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:21:13.310424 1177669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:21:13.310432 1177669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.3 crio true true} ...
	I1218 00:21:13.310536 1177669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-240845 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:21:13.310619 1177669 ssh_runner.go:195] Run: crio config
	I1218 00:21:13.358161 1177669 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:21:13.358186 1177669 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:21:13.358194 1177669 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:21:13.358198 1177669 command_runner.go:130] > #
	I1218 00:21:13.358205 1177669 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:21:13.358212 1177669 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:21:13.358218 1177669 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:21:13.358229 1177669 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:21:13.358236 1177669 command_runner.go:130] > # reload'.
	I1218 00:21:13.358243 1177669 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:21:13.358250 1177669 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:21:13.358258 1177669 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:21:13.358264 1177669 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:21:13.358267 1177669 command_runner.go:130] > [crio]
	I1218 00:21:13.358273 1177669 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:21:13.358277 1177669 command_runner.go:130] > # containers images, in this directory.
	I1218 00:21:13.358820 1177669 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:21:13.358837 1177669 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:21:13.359435 1177669 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:21:13.359448 1177669 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:21:13.359935 1177669 command_runner.go:130] > # imagestore = ""
	I1218 00:21:13.359950 1177669 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:21:13.359963 1177669 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:21:13.360646 1177669 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:21:13.360660 1177669 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:21:13.360667 1177669 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:21:13.360961 1177669 command_runner.go:130] > # storage_option = [
	I1218 00:21:13.361308 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.361321 1177669 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:21:13.361334 1177669 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:21:13.361921 1177669 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:21:13.361934 1177669 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:21:13.361949 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:21:13.361954 1177669 command_runner.go:130] > # always happen on a node reboot
	I1218 00:21:13.362559 1177669 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:21:13.362583 1177669 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:21:13.362590 1177669 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:21:13.362595 1177669 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:21:13.363052 1177669 command_runner.go:130] > # version_file_persist = ""
	I1218 00:21:13.363067 1177669 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:21:13.363076 1177669 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:21:13.363680 1177669 command_runner.go:130] > # internal_wipe = true
	I1218 00:21:13.363702 1177669 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:21:13.363709 1177669 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:21:13.364349 1177669 command_runner.go:130] > # internal_repair = true
	I1218 00:21:13.364361 1177669 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:21:13.364368 1177669 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:21:13.364377 1177669 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:21:13.364926 1177669 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:21:13.364942 1177669 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:21:13.364946 1177669 command_runner.go:130] > [crio.api]
	I1218 00:21:13.364951 1177669 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:21:13.365581 1177669 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:21:13.365594 1177669 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:21:13.367685 1177669 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:21:13.367700 1177669 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:21:13.367706 1177669 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:21:13.367710 1177669 command_runner.go:130] > # stream_port = "0"
	I1218 00:21:13.367716 1177669 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:21:13.367723 1177669 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:21:13.367730 1177669 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:21:13.367745 1177669 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:21:13.367752 1177669 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:21:13.367762 1177669 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367766 1177669 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:21:13.367773 1177669 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:21:13.367780 1177669 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:21:13.367784 1177669 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:21:13.367791 1177669 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:21:13.367802 1177669 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:21:13.367808 1177669 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:21:13.367814 1177669 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:21:13.367835 1177669 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367844 1177669 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:21:13.367853 1177669 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:21:13.367861 1177669 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:21:13.367868 1177669 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:21:13.367879 1177669 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:21:13.367883 1177669 command_runner.go:130] > [crio.runtime]
	I1218 00:21:13.367893 1177669 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:21:13.367904 1177669 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:21:13.367916 1177669 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:21:13.367926 1177669 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:21:13.367934 1177669 command_runner.go:130] > # default_ulimits = [
	I1218 00:21:13.367937 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.367950 1177669 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:21:13.367958 1177669 command_runner.go:130] > # no_pivot = false
	I1218 00:21:13.367963 1177669 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:21:13.367974 1177669 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:21:13.367979 1177669 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:21:13.367988 1177669 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:21:13.367994 1177669 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:21:13.368004 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368012 1177669 command_runner.go:130] > # conmon = ""
	I1218 00:21:13.368015 1177669 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:21:13.368023 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:21:13.368026 1177669 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:21:13.368035 1177669 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:21:13.368044 1177669 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:21:13.368051 1177669 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:21:13.368058 1177669 command_runner.go:130] > # conmon_env = [
	I1218 00:21:13.368061 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368070 1177669 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:21:13.368076 1177669 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:21:13.368084 1177669 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:21:13.368089 1177669 command_runner.go:130] > # default_env = [
	I1218 00:21:13.368092 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368098 1177669 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:21:13.368111 1177669 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:21:13.368119 1177669 command_runner.go:130] > # selinux = false
	I1218 00:21:13.368125 1177669 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:21:13.368136 1177669 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:21:13.368144 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368148 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.368159 1177669 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:21:13.368167 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368171 1177669 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:21:13.368178 1177669 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:21:13.368189 1177669 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:21:13.368199 1177669 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:21:13.368206 1177669 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:21:13.368212 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368217 1177669 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:21:13.368256 1177669 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:21:13.368261 1177669 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:21:13.368266 1177669 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:21:13.368280 1177669 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:21:13.368287 1177669 command_runner.go:130] > # blockio parameters.
	I1218 00:21:13.368292 1177669 command_runner.go:130] > # blockio_reload = false
	I1218 00:21:13.368298 1177669 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:21:13.368303 1177669 command_runner.go:130] > # irqbalance daemon.
	I1218 00:21:13.368311 1177669 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:21:13.368320 1177669 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:21:13.368327 1177669 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:21:13.368337 1177669 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:21:13.368347 1177669 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:21:13.368357 1177669 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:21:13.368365 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.368369 1177669 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:21:13.368375 1177669 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:21:13.368382 1177669 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:21:13.368388 1177669 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:21:13.368396 1177669 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:21:13.368402 1177669 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:21:13.368412 1177669 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:21:13.368419 1177669 command_runner.go:130] > # will be added.
	I1218 00:21:13.368423 1177669 command_runner.go:130] > # default_capabilities = [
	I1218 00:21:13.368430 1177669 command_runner.go:130] > # 	"CHOWN",
	I1218 00:21:13.368434 1177669 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:21:13.368442 1177669 command_runner.go:130] > # 	"FSETID",
	I1218 00:21:13.368445 1177669 command_runner.go:130] > # 	"FOWNER",
	I1218 00:21:13.368457 1177669 command_runner.go:130] > # 	"SETGID",
	I1218 00:21:13.368461 1177669 command_runner.go:130] > # 	"SETUID",
	I1218 00:21:13.368479 1177669 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:21:13.368487 1177669 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:21:13.368490 1177669 command_runner.go:130] > # 	"KILL",
	I1218 00:21:13.368494 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368506 1177669 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:21:13.368515 1177669 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:21:13.368524 1177669 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:21:13.368531 1177669 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:21:13.368539 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368542 1177669 command_runner.go:130] > default_sysctls = [
	I1218 00:21:13.368547 1177669 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:21:13.368554 1177669 command_runner.go:130] > ]
	I1218 00:21:13.368563 1177669 command_runner.go:130] > # List of devices on the host that a
	I1218 00:21:13.368570 1177669 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:21:13.368577 1177669 command_runner.go:130] > # allowed_devices = [
	I1218 00:21:13.368580 1177669 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:21:13.368588 1177669 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:21:13.368594 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368603 1177669 command_runner.go:130] > # List of additional devices. specified as
	I1218 00:21:13.368611 1177669 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:21:13.368618 1177669 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:21:13.368624 1177669 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:21:13.368628 1177669 command_runner.go:130] > # additional_devices = [
	I1218 00:21:13.368633 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368639 1177669 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:21:13.368646 1177669 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:21:13.368649 1177669 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:21:13.368653 1177669 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:21:13.368664 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368673 1177669 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:21:13.368683 1177669 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:21:13.368701 1177669 command_runner.go:130] > # Defaults to false.
	I1218 00:21:13.368712 1177669 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:21:13.368719 1177669 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:21:13.368725 1177669 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:21:13.368734 1177669 command_runner.go:130] > # hooks_dir = [
	I1218 00:21:13.368739 1177669 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:21:13.368745 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.368751 1177669 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:21:13.368761 1177669 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:21:13.368770 1177669 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:21:13.368773 1177669 command_runner.go:130] > #
	I1218 00:21:13.368780 1177669 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:21:13.368789 1177669 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:21:13.368795 1177669 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:21:13.368803 1177669 command_runner.go:130] > #
	I1218 00:21:13.368809 1177669 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:21:13.368818 1177669 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:21:13.368829 1177669 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:21:13.368846 1177669 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:21:13.368853 1177669 command_runner.go:130] > #
	I1218 00:21:13.368857 1177669 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:21:13.368866 1177669 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:21:13.368876 1177669 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:21:13.368880 1177669 command_runner.go:130] > # pids_limit = -1
	I1218 00:21:13.368886 1177669 command_runner.go:130] > # Maximum sized allowed for the container log file. Negative numbers indicate
	I1218 00:21:13.368894 1177669 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:21:13.368904 1177669 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:21:13.368917 1177669 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:21:13.368923 1177669 command_runner.go:130] > # log_size_max = -1
	I1218 00:21:13.368931 1177669 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:21:13.368938 1177669 command_runner.go:130] > # log_to_journald = false
	I1218 00:21:13.368944 1177669 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:21:13.368949 1177669 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:21:13.368959 1177669 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:21:13.368968 1177669 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:21:13.368974 1177669 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:21:13.368981 1177669 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:21:13.368986 1177669 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:21:13.368993 1177669 command_runner.go:130] > # read_only = false
	I1218 00:21:13.369000 1177669 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:21:13.369009 1177669 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:21:13.369017 1177669 command_runner.go:130] > # live configuration reload.
	I1218 00:21:13.369020 1177669 command_runner.go:130] > # log_level = "info"
	I1218 00:21:13.369026 1177669 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:21:13.369031 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.369036 1177669 command_runner.go:130] > # log_filter = ""
	I1218 00:21:13.369043 1177669 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369052 1177669 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:21:13.369056 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369067 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369074 1177669 command_runner.go:130] > # uid_mappings = ""
	I1218 00:21:13.369084 1177669 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:21:13.369093 1177669 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:21:13.369097 1177669 command_runner.go:130] > # separated by comma.
	I1218 00:21:13.369105 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369114 1177669 command_runner.go:130] > # gid_mappings = ""
	I1218 00:21:13.369120 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:21:13.369127 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369139 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369150 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369158 1177669 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:21:13.369165 1177669 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:21:13.369174 1177669 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:21:13.369184 1177669 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:21:13.369192 1177669 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:21:13.369196 1177669 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:21:13.369208 1177669 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:21:13.369218 1177669 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:21:13.369224 1177669 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:21:13.369231 1177669 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:21:13.369238 1177669 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:21:13.369247 1177669 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:21:13.369256 1177669 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:21:13.369261 1177669 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:21:13.369265 1177669 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:21:13.369273 1177669 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:21:13.369279 1177669 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:21:13.369286 1177669 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:21:13.369293 1177669 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:21:13.369301 1177669 command_runner.go:130] > # shared_cpuset  determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:21:13.369310 1177669 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:21:13.369320 1177669 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:21:13.369326 1177669 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:21:13.369333 1177669 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:21:13.369339 1177669 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:21:13.369347 1177669 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:21:13.369351 1177669 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:21:13.369359 1177669 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:21:13.369363 1177669 command_runner.go:130] > # pinns_path = ""
	I1218 00:21:13.369368 1177669 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:21:13.369378 1177669 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:21:13.369382 1177669 command_runner.go:130] > # enable_criu_support = true
	I1218 00:21:13.369390 1177669 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:21:13.369400 1177669 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:21:13.369407 1177669 command_runner.go:130] > # enable_pod_events = false
	I1218 00:21:13.369414 1177669 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:21:13.369422 1177669 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:21:13.369426 1177669 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:21:13.369431 1177669 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:21:13.369443 1177669 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:21:13.369457 1177669 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:21:13.369465 1177669 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:21:13.369474 1177669 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:21:13.369481 1177669 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:21:13.369486 1177669 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:21:13.369492 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.369499 1177669 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:21:13.369509 1177669 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:21:13.369515 1177669 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:21:13.369521 1177669 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:21:13.369523 1177669 command_runner.go:130] > #
	I1218 00:21:13.369528 1177669 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:21:13.369536 1177669 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:21:13.369540 1177669 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:21:13.369548 1177669 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:21:13.369553 1177669 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:21:13.369561 1177669 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:21:13.369565 1177669 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:21:13.369574 1177669 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:21:13.369577 1177669 command_runner.go:130] > # monitor_env = []
	I1218 00:21:13.369585 1177669 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:21:13.369590 1177669 command_runner.go:130] > # allowed_annotations = []
	I1218 00:21:13.369595 1177669 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:21:13.369599 1177669 command_runner.go:130] > # no_sync_log = false
	I1218 00:21:13.369603 1177669 command_runner.go:130] > # default_annotations = {}
	I1218 00:21:13.369611 1177669 command_runner.go:130] > # stream_websockets = false
	I1218 00:21:13.369614 1177669 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:21:13.369664 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.369673 1177669 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:21:13.369680 1177669 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:21:13.369686 1177669 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:21:13.369697 1177669 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:21:13.369708 1177669 command_runner.go:130] > #   in $PATH.
	I1218 00:21:13.369718 1177669 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:21:13.369728 1177669 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:21:13.369735 1177669 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:21:13.369741 1177669 command_runner.go:130] > #   state.
	I1218 00:21:13.369747 1177669 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:21:13.369753 1177669 command_runner.go:130] > #   file. This can only be used with when using the VM runtime_type.
	I1218 00:21:13.369759 1177669 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:21:13.369765 1177669 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:21:13.369774 1177669 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:21:13.369780 1177669 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:21:13.369789 1177669 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:21:13.369795 1177669 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:21:13.369805 1177669 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:21:13.369813 1177669 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:21:13.369820 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:21:13.369831 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:21:13.370100 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:21:13.370120 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:21:13.370129 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:21:13.370143 1177669 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:21:13.370151 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:21:13.370162 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:21:13.370169 1177669 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:21:13.370176 1177669 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:21:13.370187 1177669 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:21:13.370195 1177669 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:21:13.370206 1177669 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:21:13.370213 1177669 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:21:13.370219 1177669 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:21:13.370232 1177669 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:21:13.370239 1177669 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:21:13.370249 1177669 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:21:13.370266 1177669 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:21:13.370271 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:21:13.370283 1177669 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:21:13.370288 1177669 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:21:13.370295 1177669 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:21:13.370305 1177669 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:21:13.370313 1177669 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:21:13.370317 1177669 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:21:13.370329 1177669 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:21:13.370338 1177669 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:21:13.370350 1177669 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:21:13.370357 1177669 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:21:13.370367 1177669 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:21:13.370375 1177669 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:21:13.370388 1177669 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:21:13.370395 1177669 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:21:13.370408 1177669 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:21:13.370420 1177669 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:21:13.370425 1177669 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:21:13.370437 1177669 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:21:13.370445 1177669 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:21:13.370459 1177669 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:21:13.370466 1177669 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:21:13.370473 1177669 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:21:13.370485 1177669 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:21:13.370488 1177669 command_runner.go:130] > #
	I1218 00:21:13.370493 1177669 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:21:13.370496 1177669 command_runner.go:130] > #
	I1218 00:21:13.370506 1177669 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:21:13.370513 1177669 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:21:13.370516 1177669 command_runner.go:130] > #
	I1218 00:21:13.370525 1177669 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:21:13.370537 1177669 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:21:13.370545 1177669 command_runner.go:130] > #
	I1218 00:21:13.370553 1177669 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:21:13.370556 1177669 command_runner.go:130] > # feature.
	I1218 00:21:13.370563 1177669 command_runner.go:130] > #
	I1218 00:21:13.370569 1177669 command_runner.go:130] > # If everything is setup, CRI-O will modify chosen seccomp profiles for
	I1218 00:21:13.370576 1177669 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:21:13.370587 1177669 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:21:13.370594 1177669 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:21:13.370600 1177669 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:21:13.370610 1177669 command_runner.go:130] > #
	I1218 00:21:13.370618 1177669 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:21:13.370625 1177669 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:21:13.370628 1177669 command_runner.go:130] > #
	I1218 00:21:13.370638 1177669 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:21:13.370644 1177669 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:21:13.370647 1177669 command_runner.go:130] > #
	I1218 00:21:13.370657 1177669 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:21:13.370664 1177669 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:21:13.370667 1177669 command_runner.go:130] > # limitation.
	I1218 00:21:13.370672 1177669 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:21:13.370680 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:21:13.370684 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.370688 1177669 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:21:13.370695 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.370699 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371091 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371100 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371106 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371111 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371151 1177669 command_runner.go:130] > allowed_annotations = [
	I1218 00:21:13.371159 1177669 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:21:13.371163 1177669 command_runner.go:130] > ]
	I1218 00:21:13.371167 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371172 1177669 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:21:13.371180 1177669 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:21:13.371184 1177669 command_runner.go:130] > runtime_type = ""
	I1218 00:21:13.371188 1177669 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:21:13.371224 1177669 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:21:13.371229 1177669 command_runner.go:130] > runtime_config_path = ""
	I1218 00:21:13.371233 1177669 command_runner.go:130] > container_min_memory = ""
	I1218 00:21:13.371242 1177669 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:21:13.371253 1177669 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:21:13.371257 1177669 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:21:13.371263 1177669 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:21:13.371305 1177669 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:21:13.371314 1177669 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:21:13.371321 1177669 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:21:13.371342 1177669 command_runner.go:130] > # Each workload, has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:21:13.371388 1177669 command_runner.go:130] > # The currently supported resources are "cpuperiod" "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:21:13.371402 1177669 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:21:13.371414 1177669 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:21:13.371421 1177669 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:21:13.371470 1177669 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:21:13.371479 1177669 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:21:13.371490 1177669 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:21:13.371528 1177669 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:21:13.371536 1177669 command_runner.go:130] > # Example:
	I1218 00:21:13.371546 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:21:13.371551 1177669 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:21:13.371556 1177669 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:21:13.371561 1177669 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:21:13.371569 1177669 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:21:13.371573 1177669 command_runner.go:130] > # cpushares = "5"
	I1218 00:21:13.371606 1177669 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:21:13.371613 1177669 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:21:13.371617 1177669 command_runner.go:130] > # cpulimit = "35"
	I1218 00:21:13.371620 1177669 command_runner.go:130] > # Where:
	I1218 00:21:13.371629 1177669 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:21:13.371636 1177669 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:21:13.371647 1177669 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:21:13.371690 1177669 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:21:13.371702 1177669 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:21:13.371713 1177669 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:21:13.371718 1177669 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:21:13.371726 1177669 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:21:13.371777 1177669 command_runner.go:130] > # Default value is set to true
	I1218 00:21:13.371785 1177669 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:21:13.371791 1177669 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:21:13.371796 1177669 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:21:13.371805 1177669 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:21:13.371846 1177669 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:21:13.371855 1177669 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:21:13.371869 1177669 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:21:13.371873 1177669 command_runner.go:130] > # timezone = ""
	I1218 00:21:13.371880 1177669 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:21:13.371883 1177669 command_runner.go:130] > #
	I1218 00:21:13.371923 1177669 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:21:13.371933 1177669 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:21:13.371937 1177669 command_runner.go:130] > [crio.image]
	I1218 00:21:13.371948 1177669 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:21:13.371953 1177669 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:21:13.371960 1177669 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:21:13.372001 1177669 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372008 1177669 command_runner.go:130] > # global_auth_file = ""
	I1218 00:21:13.372014 1177669 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:21:13.372020 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372029 1177669 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:21:13.372036 1177669 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:21:13.372043 1177669 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:21:13.372052 1177669 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:21:13.372057 1177669 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:21:13.372094 1177669 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:21:13.372111 1177669 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:21:13.372119 1177669 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:21:13.372125 1177669 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:21:13.372134 1177669 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:21:13.372140 1177669 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:21:13.372147 1177669 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:21:13.372187 1177669 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:21:13.372197 1177669 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:21:13.372204 1177669 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:21:13.372215 1177669 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:21:13.372270 1177669 command_runner.go:130] > # pinned_images = [
	I1218 00:21:13.372283 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372290 1177669 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:21:13.372301 1177669 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:21:13.372308 1177669 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:21:13.372319 1177669 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:21:13.372324 1177669 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:21:13.372362 1177669 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:21:13.372371 1177669 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:21:13.372384 1177669 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:21:13.372391 1177669 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:21:13.372402 1177669 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or the
	I1218 00:21:13.372408 1177669 command_runner.go:130] > # system-wide policy will be used as a fallback. Must be an absolute path.
	I1218 00:21:13.372414 1177669 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:21:13.372450 1177669 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:21:13.372460 1177669 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:21:13.372464 1177669 command_runner.go:130] > # changing them here.
	I1218 00:21:13.372475 1177669 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:21:13.372479 1177669 command_runner.go:130] > # insecure_registries = [
	I1218 00:21:13.372482 1177669 command_runner.go:130] > # ]
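	As the comment above recommends, insecure registries are better declared in /etc/containers/registries.conf than via the deprecated crio.conf option. A minimal sketch of such an entry (the registry hostname is hypothetical):

	```toml
	# Hypothetical entry in /etc/containers/registries.conf
	[[registry]]
	location = "registry.example.internal:5000"
	insecure = true
	```

	With `insecure = true`, image pulls from that one registry skip TLS verification while all other registries keep the default policy.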
	I1218 00:21:13.372489 1177669 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:21:13.372498 1177669 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:21:13.372502 1177669 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:21:13.372541 1177669 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:21:13.372549 1177669 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:21:13.372559 1177669 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:21:13.372567 1177669 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:21:13.372572 1177669 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:21:13.372582 1177669 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:21:13.372591 1177669 command_runner.go:130] > # gets canceled. This value will also be used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:21:13.372630 1177669 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:21:13.372638 1177669 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:21:13.372643 1177669 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:21:13.372650 1177669 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:21:13.372667 1177669 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:21:13.372672 1177669 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:21:13.372678 1177669 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:21:13.372721 1177669 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:21:13.372730 1177669 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:21:13.372735 1177669 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:21:13.372746 1177669 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:21:13.372750 1177669 command_runner.go:130] > # CNI plugins.
	I1218 00:21:13.372753 1177669 command_runner.go:130] > [crio.network]
	I1218 00:21:13.372759 1177669 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:21:13.372769 1177669 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:21:13.372773 1177669 command_runner.go:130] > # cni_default_network = ""
	I1218 00:21:13.372780 1177669 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:21:13.372837 1177669 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:21:13.372851 1177669 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:21:13.372856 1177669 command_runner.go:130] > # plugin_dirs = [
	I1218 00:21:13.372860 1177669 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:21:13.372863 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372867 1177669 command_runner.go:130] > # List of included pod metrics.
	I1218 00:21:13.372903 1177669 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:21:13.372909 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.372923 1177669 command_runner.go:130] > # A necessary configuration for Prometheus-based metrics retrieval
	I1218 00:21:13.372927 1177669 command_runner.go:130] > [crio.metrics]
	I1218 00:21:13.372933 1177669 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:21:13.372941 1177669 command_runner.go:130] > # enable_metrics = false
	I1218 00:21:13.372946 1177669 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:21:13.372951 1177669 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:21:13.372958 1177669 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:21:13.372999 1177669 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:21:13.373006 1177669 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:21:13.373010 1177669 command_runner.go:130] > # metrics_collectors = [
	I1218 00:21:13.373018 1177669 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:21:13.373023 1177669 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:21:13.373033 1177669 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:21:13.373037 1177669 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:21:13.373042 1177669 command_runner.go:130] > # 	"operations_total",
	I1218 00:21:13.373077 1177669 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:21:13.373084 1177669 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:21:13.373089 1177669 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:21:13.373093 1177669 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:21:13.373098 1177669 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:21:13.373106 1177669 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:21:13.373111 1177669 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:21:13.373115 1177669 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:21:13.373120 1177669 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:21:13.373133 1177669 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:21:13.373167 1177669 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:21:13.373176 1177669 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:21:13.373179 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.373190 1177669 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:21:13.373199 1177669 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:21:13.373205 1177669 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:21:13.373209 1177669 command_runner.go:130] > # metrics_port = 9090
	I1218 00:21:13.373214 1177669 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:21:13.373222 1177669 command_runner.go:130] > # metrics_socket = ""
	I1218 00:21:13.373425 1177669 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:21:13.373436 1177669 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:21:13.373448 1177669 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:21:13.373454 1177669 command_runner.go:130] > # certificate on any modification event.
	I1218 00:21:13.373457 1177669 command_runner.go:130] > # metrics_cert = ""
	I1218 00:21:13.373463 1177669 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:21:13.373472 1177669 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:21:13.373475 1177669 command_runner.go:130] > # metrics_key = ""
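	Every setting in the [crio.metrics] section above is commented out, so the endpoint stays disabled. As a sketch, a drop-in file such as the following (the 11-metrics.conf filename is hypothetical) would turn it on with the default host and port shown in the log:

	```toml
	# Hypothetical drop-in: /etc/crio/crio.conf.d/11-metrics.conf
	# Enables the Prometheus endpoint described in the comments above.
	[crio.metrics]
	enable_metrics = true
	metrics_host = "127.0.0.1"
	metrics_port = 9090
	```

	After restarting CRI-O, the collectors listed above would be scrapeable at http://127.0.0.1:9090/metrics.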
	I1218 00:21:13.373510 1177669 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:21:13.373518 1177669 command_runner.go:130] > [crio.tracing]
	I1218 00:21:13.373528 1177669 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:21:13.373538 1177669 command_runner.go:130] > # enable_tracing = false
	I1218 00:21:13.373545 1177669 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:21:13.373549 1177669 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:21:13.373560 1177669 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:21:13.373565 1177669 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:21:13.373569 1177669 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:21:13.373602 1177669 command_runner.go:130] > [crio.nri]
	I1218 00:21:13.373606 1177669 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:21:13.373614 1177669 command_runner.go:130] > # enable_nri = true
	I1218 00:21:13.373618 1177669 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:21:13.373623 1177669 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:21:13.373628 1177669 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:21:13.373632 1177669 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:21:13.373641 1177669 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:21:13.373646 1177669 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:21:13.373652 1177669 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:21:13.374323 1177669 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:21:13.374347 1177669 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:21:13.374353 1177669 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:21:13.374359 1177669 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:21:13.374369 1177669 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:21:13.374374 1177669 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:21:13.374384 1177669 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:21:13.374396 1177669 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:21:13.374400 1177669 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:21:13.374404 1177669 command_runner.go:130] > # - OCI hook injection
	I1218 00:21:13.374410 1177669 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:21:13.374419 1177669 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:21:13.374424 1177669 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:21:13.374429 1177669 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:21:13.374440 1177669 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:21:13.374447 1177669 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:21:13.374453 1177669 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:21:13.374461 1177669 command_runner.go:130] > #
	I1218 00:21:13.374470 1177669 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:21:13.374475 1177669 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:21:13.374481 1177669 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:21:13.374487 1177669 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:21:13.374497 1177669 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:21:13.374503 1177669 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:21:13.374508 1177669 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:21:13.374517 1177669 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:21:13.374520 1177669 command_runner.go:130] > # ]
	I1218 00:21:13.374526 1177669 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:21:13.374532 1177669 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:21:13.374540 1177669 command_runner.go:130] > [crio.stats]
	I1218 00:21:13.374546 1177669 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:21:13.374552 1177669 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:21:13.374557 1177669 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:21:13.374567 1177669 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:21:13.374574 1177669 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:21:13.374578 1177669 command_runner.go:130] > # collection_period = 0
	I1218 00:21:13.375235 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337716712Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:21:13.375252 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337755529Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:21:13.375261 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337787676Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:21:13.375269 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337813217Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:21:13.375279 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.337887603Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:21:13.375295 1177669 command_runner.go:130] ! time="2025-12-18T00:21:13.338323059Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:21:13.375307 1177669 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:21:13.375636 1177669 cni.go:84] Creating CNI manager for ""
	I1218 00:21:13.375654 1177669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:21:13.375670 1177669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:21:13.375692 1177669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-240845 NodeName:functional-240845 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
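	The kubeadm options above pick a pod subnet (10.244.0.0/16), a service CIDR (10.96.0.0/12), and a node advertise address (192.168.49.2) that must not collide. A quick stdlib check of those three values from the log:

	```python
	# Sanity-check that the CIDRs chosen by minikube (values taken from the
	# kubeadm options in the log above) are mutually disjoint.
	import ipaddress

	pod_subnet = ipaddress.ip_network("10.244.0.0/16")    # PodSubnet
	service_cidr = ipaddress.ip_network("10.96.0.0/12")   # ServiceCIDR
	node_ip = ipaddress.ip_address("192.168.49.2")        # AdvertiseAddress

	print(pod_subnet.overlaps(service_cidr))  # False: pod and service ranges are disjoint
	print(node_ip in pod_subnet)              # False: node IP sits outside the pod range
	```

	If either check printed True, pods or services could shadow real cluster addresses, which is why these defaults live in non-adjacent 10.x blocks.
	
	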
	I1218 00:21:13.375818 1177669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-240845"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:21:13.375897 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 00:21:13.382943 1177669 command_runner.go:130] > kubeadm
	I1218 00:21:13.382987 1177669 command_runner.go:130] > kubectl
	I1218 00:21:13.382992 1177669 command_runner.go:130] > kubelet
	I1218 00:21:13.383228 1177669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:21:13.383323 1177669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:21:13.390563 1177669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (367 bytes)
	I1218 00:21:13.402469 1177669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 00:21:13.415695 1177669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2214 bytes)
	I1218 00:21:13.427935 1177669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:21:13.431432 1177669 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:21:13.431528 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:13.573724 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:13.587283 1177669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845 for IP: 192.168.49.2
	I1218 00:21:13.587308 1177669 certs.go:195] generating shared ca certs ...
	I1218 00:21:13.587325 1177669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:13.587468 1177669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:21:13.587523 1177669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:21:13.587535 1177669 certs.go:257] generating profile certs ...
	I1218 00:21:13.587627 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key
	I1218 00:21:13.587682 1177669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key.83c30509
	I1218 00:21:13.587749 1177669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key
	I1218 00:21:13.587763 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:21:13.587778 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:21:13.587791 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:21:13.587807 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:21:13.587827 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:21:13.587840 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:21:13.587855 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:21:13.587866 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:21:13.587928 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:21:13.587965 1177669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:21:13.587976 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:21:13.588004 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:21:13.588031 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:21:13.588058 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:21:13.588108 1177669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:21:13.588142 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.588156 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.588167 1177669 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.588757 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:21:13.607287 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:21:13.626005 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:21:13.643497 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:21:13.660653 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:21:13.677616 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1218 00:21:13.694313 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:21:13.711161 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 00:21:13.728011 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:21:13.745006 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:21:13.761771 1177669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:21:13.778664 1177669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:21:13.791171 1177669 ssh_runner.go:195] Run: openssl version
	I1218 00:21:13.796833 1177669 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:21:13.797285 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.804618 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:21:13.812913 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816610 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816655 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:18 /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.816704 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:21:13.857240 1177669 command_runner.go:130] > 51391683
	I1218 00:21:13.857318 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:21:13.864756 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.871981 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:21:13.879459 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883023 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883055 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:18 /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.883126 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:21:13.923479 1177669 command_runner.go:130] > 3ec20f2e
	I1218 00:21:13.923967 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:21:13.931505 1177669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.938743 1177669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:21:13.946369 1177669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950234 1177669 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950276 1177669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.950327 1177669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:21:13.990419 1177669 command_runner.go:130] > b5213941
	I1218 00:21:13.990837 1177669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
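	The three hash-and-link runs above follow the standard OpenSSL trust-store convention: `openssl x509 -hash` derives an 8-hex-digit subject hash (e.g. 51391683), and the CA file is symlinked as /etc/ssl/certs/&lt;hash&gt;.0 so TLS clients can find it. A sketch of that step using a throwaway self-signed cert (assumes the `openssl` CLI is on PATH; all names and paths below are illustrative temp files, not minikube's):

	```python
	# Reproduce the hash-and-symlink step from the log with a disposable CA.
	import os
	import subprocess
	import tempfile

	with tempfile.TemporaryDirectory() as d:
	    pem = os.path.join(d, "demoCA.pem")
	    key = os.path.join(d, "demoCA.key")
	    # Generate a throwaway self-signed cert (stand-in for minikubeCA.pem).
	    subprocess.run(
	        ["openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
	         "-subj", "/CN=demoCA", "-keyout", key, "-out", pem, "-days", "2"],
	        check=True, capture_output=True)
	    # Same command the log runs: derive the 8-hex-digit subject hash.
	    h = subprocess.run(
	        ["openssl", "x509", "-hash", "-noout", "-in", pem],
	        check=True, capture_output=True, text=True).stdout.strip()
	    link = os.path.join(d, h + ".0")  # in a real store: /etc/ssl/certs/<hash>.0
	    os.symlink(pem, link)             # mirrors the `ln -fs` calls in the log
	    print(link)
	```

	The trailing `.0` is a collision counter: a second CA with the same subject hash would be linked as `<hash>.1`.
	
	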
	I1218 00:21:13.998401 1177669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003376 1177669 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:21:14.003402 1177669 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:21:14.003409 1177669 command_runner.go:130] > Device: 259,1	Inode: 1327743     Links: 1
	I1218 00:21:14.003416 1177669 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:21:14.003422 1177669 command_runner.go:130] > Access: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003427 1177669 command_runner.go:130] > Modify: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003432 1177669 command_runner.go:130] > Change: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003438 1177669 command_runner.go:130] >  Birth: 2025-12-18 00:18:58.627802303 +0000
	I1218 00:21:14.003512 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:21:14.045243 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.045691 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:21:14.086658 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.086738 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:21:14.127897 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.128372 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:21:14.168626 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.169131 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:21:14.209194 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.209712 1177669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:21:14.250333 1177669 command_runner.go:130] > Certificate will not expire
	I1218 00:21:14.250470 1177669 kubeadm.go:401] StartCluster: {Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:21:14.250558 1177669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:21:14.250623 1177669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:21:14.277777 1177669 command_runner.go:130] > e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563
	I1218 00:21:14.277803 1177669 command_runner.go:130] > 0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c
	I1218 00:21:14.277811 1177669 command_runner.go:130] > 95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e
	I1218 00:21:14.277820 1177669 command_runner.go:130] > 1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5
	I1218 00:21:14.277826 1177669 command_runner.go:130] > 9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1
	I1218 00:21:14.277832 1177669 command_runner.go:130] > 3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad
	I1218 00:21:14.277838 1177669 command_runner.go:130] > cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15
	I1218 00:21:14.277846 1177669 command_runner.go:130] > 38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68
	I1218 00:21:14.277857 1177669 command_runner.go:130] > 1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b
	I1218 00:21:14.277868 1177669 command_runner.go:130] > 61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a
	I1218 00:21:14.277874 1177669 command_runner.go:130] > 98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe
	I1218 00:21:14.277883 1177669 command_runner.go:130] > 891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b
	I1218 00:21:14.277889 1177669 command_runner.go:130] > 2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de
	I1218 00:21:14.277898 1177669 command_runner.go:130] > b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06
	I1218 00:21:14.280281 1177669 cri.go:89] found id: "e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563"
	I1218 00:21:14.280303 1177669 cri.go:89] found id: "0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c"
	I1218 00:21:14.280308 1177669 cri.go:89] found id: "95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e"
	I1218 00:21:14.280312 1177669 cri.go:89] found id: "1c26d35ef1ddb9861fb11e7012a5b7291519d6b8a07ba6b5be725c172ba872e5"
	I1218 00:21:14.280315 1177669 cri.go:89] found id: "9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1"
	I1218 00:21:14.280319 1177669 cri.go:89] found id: "3fc162f056d9a283744eefe7fcd141609ed138d5c7fc0974fadef1e3b4e0e1ad"
	I1218 00:21:14.280323 1177669 cri.go:89] found id: "cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15"
	I1218 00:21:14.280326 1177669 cri.go:89] found id: "38bf939d8b0354250e78584abdddf8bbbed831e6e5ea99d61a6f253d21a10f68"
	I1218 00:21:14.280329 1177669 cri.go:89] found id: "1efae5a52dcfa095ed4190b749aa70c8481bc20ef3d722e7a1f0929aff74b39b"
	I1218 00:21:14.280337 1177669 cri.go:89] found id: "61468203ccb0a6f7599c6be9702525af6119be3ae46ddc18022384f43b62543a"
	I1218 00:21:14.280343 1177669 cri.go:89] found id: "98c5047a268da384edf25411848ef8e4176861aa65095361e7c269446f69d9fe"
	I1218 00:21:14.280347 1177669 cri.go:89] found id: "891e79b326ed49bff724a0e49e97256d5a80c477da8afd5b6bb5a90ab82ec53b"
	I1218 00:21:14.280355 1177669 cri.go:89] found id: "2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	I1218 00:21:14.280359 1177669 cri.go:89] found id: "b97ba93c0f7ab7e222f3d8b8a7350deb8801d0b0bd76dc4dea58d58990aa0b06"
	I1218 00:21:14.280362 1177669 cri.go:89] found id: ""
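	[editor's note] `crictl ps -a --quiet` prints one container ID per line; the `found id:` entries above show that raw output split into individual IDs, ending with an empty sentinel. A hedged Go sketch of that parsing step (function name and sample IDs are illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// parseContainerIDs splits the raw `crictl ps -a --quiet` output into
// one container ID per entry, dropping blank lines and whitespace.
func parseContainerIDs(out string) []string {
	var ids []string
	for _, line := range strings.Split(out, "\n") {
		if id := strings.TrimSpace(line); id != "" {
			ids = append(ids, id)
		}
	}
	return ids
}

func main() {
	// Shortened, made-up IDs standing in for the 64-hex-char ones above.
	raw := "e79c8e6ec837\n0fe4c80fa2ad\n\n"
	fmt.Println(parseContainerIDs(raw)) // [e79c8e6ec837 0fe4c80fa2ad]
}
```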
	I1218 00:21:14.280415 1177669 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 00:21:14.291297 1177669 command_runner.go:130] ! time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	W1218 00:21:14.291357 1177669 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:21:14Z" level=error msg="open /run/runc: no such file or directory"
	I1218 00:21:14.291439 1177669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:21:14.298396 1177669 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:21:14.298416 1177669 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:21:14.298422 1177669 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:21:14.298426 1177669 command_runner.go:130] > member
	I1218 00:21:14.299333 1177669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:21:14.299377 1177669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:21:14.299453 1177669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:21:14.306750 1177669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:21:14.307329 1177669 kubeconfig.go:125] found "functional-240845" server: "https://192.168.49.2:8441"
	I1218 00:21:14.308688 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.308922 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.310273 1177669 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:21:14.310295 1177669 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:21:14.310301 1177669 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:21:14.310306 1177669 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:21:14.310311 1177669 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:21:14.310598 1177669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:21:14.310964 1177669 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:21:14.321005 1177669 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:21:14.321038 1177669 kubeadm.go:602] duration metric: took 21.641512ms to restartPrimaryControlPlane
	I1218 00:21:14.321068 1177669 kubeadm.go:403] duration metric: took 70.601924ms to StartCluster
	I1218 00:21:14.321095 1177669 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.321175 1177669 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.321832 1177669 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:21:14.322054 1177669 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:21:14.322232 1177669 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:21:14.322270 1177669 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:21:14.322334 1177669 addons.go:70] Setting storage-provisioner=true in profile "functional-240845"
	I1218 00:21:14.322346 1177669 addons.go:239] Setting addon storage-provisioner=true in "functional-240845"
	W1218 00:21:14.322351 1177669 addons.go:248] addon storage-provisioner should already be in state true
	I1218 00:21:14.322373 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.322797 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.323222 1177669 addons.go:70] Setting default-storageclass=true in profile "functional-240845"
	I1218 00:21:14.323243 1177669 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-240845"
	I1218 00:21:14.323528 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.326222 1177669 out.go:179] * Verifying Kubernetes components...
	I1218 00:21:14.329298 1177669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:21:14.352407 1177669 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:21:14.352567 1177669 kapi.go:59] client config for functional-240845: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:21:14.353875 1177669 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:21:14.354521 1177669 addons.go:239] Setting addon default-storageclass=true in "functional-240845"
	W1218 00:21:14.354541 1177669 addons.go:248] addon default-storageclass should already be in state true
	I1218 00:21:14.354568 1177669 host.go:66] Checking if "functional-240845" exists ...
	I1218 00:21:14.355010 1177669 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
	I1218 00:21:14.357054 1177669 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.357084 1177669 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:21:14.357149 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.385892 1177669 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.385914 1177669 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:21:14.385974 1177669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
	I1218 00:21:14.412313 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.438252 1177669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
	I1218 00:21:14.538332 1177669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:21:14.555412 1177669 node_ready.go:35] waiting up to 6m0s for node "functional-240845" to be "Ready" ...
	I1218 00:21:14.556627 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.558515 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:14.558665 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:14.559006 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:14.569919 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:14.635955 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.636102 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.636146 1177669 retry.go:31] will retry after 274.076226ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.646979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.650760 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.650797 1177669 retry.go:31] will retry after 360.821893ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.911221 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:14.974464 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:14.974555 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:14.974595 1177669 retry.go:31] will retry after 225.739861ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.012854 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.055958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.056036 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.056342 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.079682 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.079793 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.079817 1177669 retry.go:31] will retry after 552.403697ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.200970 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.261673 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.261728 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.261746 1177669 retry.go:31] will retry after 669.780864ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.556091 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:15.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:15.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:15.632797 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:15.699577 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.699638 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.699664 1177669 retry.go:31] will retry after 634.295794ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.931763 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:15.990067 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:15.993514 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:15.993545 1177669 retry.go:31] will retry after 1.113615509s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.055858 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:16.334650 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:16.392078 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:16.395777 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.395856 1177669 retry.go:31] will retry after 558.474178ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:16.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:16.556248 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:16.556629 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:16.556701 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:16.955131 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:17.055832 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.055954 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.056319 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:17.076617 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.076722 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.076755 1177669 retry.go:31] will retry after 1.676176244s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.108039 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:17.223472 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:17.223571 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.223606 1177669 retry.go:31] will retry after 1.165701868s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:17.556175 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:17.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:17.556607 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.056383 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.056458 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.056745 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.390333 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:18.466841 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.466880 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.466899 1177669 retry.go:31] will retry after 1.475434566s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.556290 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:18.556363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:18.556640 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:18.753095 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:18.817795 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:18.817871 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:18.817893 1177669 retry.go:31] will retry after 1.833170296s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:19.056294 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.056363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:19.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:19.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:19.556536 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:19.556903 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:19.943440 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:20.003817 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.008032 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.008069 1177669 retry.go:31] will retry after 3.979109659s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.056274 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.056345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.056668 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.556404 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:20.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:20.556792 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:20.652153 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:20.711890 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:20.715639 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:20.715672 1177669 retry.go:31] will retry after 3.637109781s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:21.056958 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.057040 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.057388 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:21.057444 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:21.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:21.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:21.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:22.555927 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:22.556005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:22.556330 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.056025 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.056094 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.056444 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:23.556246 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:23.556345 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:23.556676 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:23.556732 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:23.987349 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:24.051441 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.051487 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.051524 1177669 retry.go:31] will retry after 5.3171516s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.056654 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.056732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.057111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:24.353838 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:24.413422 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:24.413469 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.413487 1177669 retry.go:31] will retry after 3.340127313s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:24.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:24.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:24.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:25.555854 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:25.555928 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:25.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:26.056042 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.056124 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.056522 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:26.056585 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:26.556332 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:26.556411 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:26.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.056517 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.056589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.056942 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.555642 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:27.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:27.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:27.754507 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:27.812979 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:27.813026 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:27.813045 1177669 retry.go:31] will retry after 6.95951013s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:28.056456 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.056550 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:28.057006 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:28.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:28.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:28.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.055872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:29.368874 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:29.425933 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:29.429391 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.429423 1177669 retry.go:31] will retry after 6.711424265s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:29.555717 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:29.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:29.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.055742 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.055823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:30.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:30.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:30.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:30.556179 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:31.055879 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.055958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.056290 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:31.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:31.556084 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:31.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.056199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:32.555959 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:32.556028 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:32.556363 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:32.556413 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:33.055772 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.055844 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.056367 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:33.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:33.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:33.556178 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.055882 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.055955 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:34.556326 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:34.556397 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:34.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:34.556796 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:34.773144 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:34.829958 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:34.833899 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:34.833929 1177669 retry.go:31] will retry after 8.542321591s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:35.056329 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.056407 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.056770 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:35.556516 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:35.556605 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:35.556959 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.057369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.057701 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:36.141963 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:36.202438 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:36.202477 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.202496 1177669 retry.go:31] will retry after 7.758270018s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:36.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:36.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:36.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:37.055818 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.055893 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:37.056274 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:37.555746 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:37.555822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:37.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.055812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.056254 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:38.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:38.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:38.556149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:39.055837 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.056297 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:39.056352 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:39.556245 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:39.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:39.556663 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.056509 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.056606 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.056917 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:40.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:40.555706 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:40.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.055770 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.056183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:41.555731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:41.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:41.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:41.556201 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:42.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.055854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:42.556028 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:42.556119 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:42.556476 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.056276 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.056351 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.056698 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:43.377156 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:21:43.435792 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:43.439377 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.439408 1177669 retry.go:31] will retry after 18.255208537s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:43.556665 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:43.556738 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:43.557098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:43.557163 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:43.961544 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:21:44.047619 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:21:44.047656 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.047681 1177669 retry.go:31] will retry after 16.124184127s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:21:44.055817 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.055890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.056245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:44.556158 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:44.556259 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:44.556606 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:45.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.057068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1218 00:21:45.555737 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:45.555812 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:45.556144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:46.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:46.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:46.555906 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:46.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:46.556364 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.055801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:47.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:47.555784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:47.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.056189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:48.056258 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:48.555948 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:48.556021 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:48.556370 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.056069 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.056148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.056513 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:49.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:49.556531 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:49.556886 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:50.056538 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.056619 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.056964 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:50.057018 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:50.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:50.555766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:50.556116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.055815 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.055884 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.056198 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:51.555724 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:51.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:51.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:52.555734 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:52.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:52.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:52.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:53.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.056005 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.056346 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:53.556049 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:53.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:53.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.056030 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.056102 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.056454 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:54.556467 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:54.556542 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:54.556870 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:54.556927 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:55.055618 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.055704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.056046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:55.555616 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:55.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:55.555993 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.055712 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:56.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:56.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:56.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:57.055721 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.056158 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:57.056210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:21:57.555892 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:57.555965 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:57.556299 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.055745 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.055819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.056153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:58.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:58.555793 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:58.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:21:59.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:21:59.555877 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:21:59.556239 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:21:59.556292 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:00.055898 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.055985 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.056349 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:00.172859 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:00.349113 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:00.349165 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.349188 1177669 retry.go:31] will retry after 15.178958797s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:00.556482 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:00.556554 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:00.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.056637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.056710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.057020 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.555743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:01.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:01.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:01.695619 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:01.764253 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:01.768251 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:01.768286 1177669 retry.go:31] will retry after 20.261734519s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:02.055637 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.055714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.056058 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:02.056113 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:02.555751 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:02.555820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:02.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:03.555659 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:03.555732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:03.556022 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:04.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.056080 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:04.056144 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:04.556235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:04.556331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:04.556662 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.056450 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.056522 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.056859 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:05.556627 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:05.556731 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:05.557039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:06.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:06.056174 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:06.555712 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:06.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:06.556120 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.056150 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:07.555911 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:07.555990 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:07.556341 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:08.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.055811 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.056155 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:08.056203 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:08.555901 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:08.555978 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:08.556327 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.055791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:09.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:09.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:09.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:10.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.055797 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:10.056287 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:10.555954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:10.556049 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:10.556390 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:11.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:11.555929 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:11.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.056135 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:12.555698 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:12.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:12.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:12.556162 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:13.055804 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.056174 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:13.555873 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:13.555943 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:13.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.056098 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.056468 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:14.556454 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:14.556529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:14.556828 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:14.556880 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:15.056592 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.056660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.057019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.528571 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:15.555957 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:15.556023 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:15.556331 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:15.591509 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:15.594869 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:15.594900 1177669 retry.go:31] will retry after 30.932709272s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:16.056512 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.056582 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.056902 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:16.555621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:16.555718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:16.556051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:17.055743 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.055818 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.056124 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:17.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:17.555739 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:17.555834 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:17.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.055914 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.055987 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.056365 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:18.556060 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:18.556132 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:18.556480 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:19.056235 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.056623 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:19.056697 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:19.556580 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:19.556660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:19.556999 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.056621 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.056700 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.057051 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:20.555764 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:20.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:20.556195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.055711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:21.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:21.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:21.556142 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:21.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:22.030818 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:22:22.056348 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.056446 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.056751 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:22.091069 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:22.094766 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.094798 1177669 retry.go:31] will retry after 47.715756714s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:22:22.556459 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:22.556528 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:22.556883 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.056081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:23.555699 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:23.555792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:23.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:24.055868 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.055942 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.056302 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:24.056357 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:24.556255 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:24.556349 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:24.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.056263 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.056721 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:25.556539 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:25.556623 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:25.557021 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.055767 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.055851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:26.555952 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:26.556027 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:26.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:26.556419 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:27.056075 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.056541 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:27.556423 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:27.556518 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:27.556857 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.056638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.057038 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:28.555732 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:28.555814 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:28.556167 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:29.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:29.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:29.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:29.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:29.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.055846 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.055931 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.056335 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:30.556057 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:30.556129 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:30.556500 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:31.056282 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.056362 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.056704 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:31.056761 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:31.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:31.556564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:31.556896 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.055712 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:32.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:32.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:32.556248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.055753 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.055824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:33.556054 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:33.556161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:33.556684 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:33.556740 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:34.056460 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.056532 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.056854 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:34.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:34.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:34.556213 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.056113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:35.555794 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:35.555889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:35.556242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:36.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.056089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:36.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:36.555705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:36.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:36.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.055826 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.056241 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:37.555683 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:37.555756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:37.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:38.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.056147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:38.056206 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:38.555776 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:38.555872 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:38.556214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.055693 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.056109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:39.555711 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:39.555807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:39.556166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:40.055954 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.056034 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.056481 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:40.056546 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:40.556379 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:40.556469 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:40.556814 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.056425 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.056496 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.056800 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:41.556572 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:41.556672 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:41.557008 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.055744 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.055822 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.056248 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:42.555835 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:42.555911 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:42.556276 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:42.556330 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:43.056000 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.056095 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.056464 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:43.556247 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:43.556319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:43.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.056503 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.056852 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:44.556012 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:44.556109 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:44.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:44.556512 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:45.057726 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.057809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.058234 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:45.555978 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:45.556053 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:45.556425 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.055813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:46.528809 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:22:46.556363 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:46.556430 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:46.556863 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:46.556912 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:46.592076 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592111 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:22:46.592213 1177669 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:22:47.056004 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.056462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:47.556264 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:47.556334 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:47.556652 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.056410 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.056481 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.056790 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:48.556515 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:48.556589 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:48.556921 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:48.556975 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:49.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.055730 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:49.555733 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:49.555821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:49.556169 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.055866 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.055935 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:50.555707 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:50.555815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:50.556162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:51.056519 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.056627 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:51.057002 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:51.555636 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:51.555709 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:51.556029 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.055820 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:52.555872 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:52.555945 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:52.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.055695 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.055766 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.056179 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:53.555862 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:53.555934 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:53.556315 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:53.556377 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:54.056043 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.056137 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.056494 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:54.556337 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:54.556432 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:54.556763 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.056566 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.056640 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.056965 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:55.555663 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:55.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:55.556084 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:56.056189 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:56.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:56.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:56.556131 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.056558 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.056624 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.056923 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:57.556638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:57.556705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:57.556914 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.055638 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.055715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.055992 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:58.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:58.556608 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:58.556858 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:22:58.556906 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:22:59.055615 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.055693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.055962 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:22:59.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:22:59.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:22:59.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.055787 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.055870 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.056214 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:00.555725 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:00.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:00.556140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:01.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.056257 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:01.056324 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:01.555994 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:01.556068 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:01.556424 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.056333 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.056677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:02.556470 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:02.556547 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:02.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.055624 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.055721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.056047 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:03.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:03.555833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:03.556139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:03.556183 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:04.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.056075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:04.556084 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:04.556154 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:04.556510 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.056318 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.056386 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.056739 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:05.556502 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:05.556575 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:05.556888 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:05.556938 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:06.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.056072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:06.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:06.555824 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:06.556163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.056617 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.056703 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.057017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:07.555759 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:07.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:07.556245 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:08.055620 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.055732 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:08.056120 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:08.555828 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:08.555904 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:08.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.055992 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.056064 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.056490 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.556279 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:09.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:09.556677 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:09.811086 1177669 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:23:09.870262 1177669 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873844 1177669 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:23:09.873941 1177669 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:23:09.877215 1177669 out.go:179] * Enabled addons: 
	I1218 00:23:09.880843 1177669 addons.go:530] duration metric: took 1m55.558566134s for enable addons: enabled=[]
	I1218 00:23:10.056212 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.056346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.056713 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:10.056767 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:10.556554 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:10.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:10.556967 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.055656 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.055785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:11.555809 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:11.555880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:11.556212 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.055838 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.056185 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:12.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:12.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:12.556050 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:12.556108 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:13.055729 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.055825 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.056171 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:13.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:13.556080 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:13.556462 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.055740 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.055821 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.056182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:14.556315 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:14.556385 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:14.556694 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:14.556741 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:15.056401 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.056470 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.056793 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:15.556411 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:15.556480 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:15.556780 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.056647 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.056963 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:16.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:16.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:16.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:17.055850 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.055925 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.056282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:17.056334 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:17.556007 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:17.556122 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:17.556497 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.056286 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.056361 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.056685 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:18.556408 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:18.556476 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:18.556802 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:19.056570 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.056646 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.057040 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:19.057095 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:19.555738 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:19.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:19.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:20.555860 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:20.555958 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:20.556317 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.056081 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.056416 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:21.555830 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:21.555903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:21.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:21.556323 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:22.056015 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.056091 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.056432 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:22.556188 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:22.556285 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:22.556619 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.056387 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.056459 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.056805 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:23.556577 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:23.556649 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:23.556991 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:23.557043 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:24.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:24.555995 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:24.556090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:24.556429 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.056237 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.056319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.056651 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:25.556407 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:25.556484 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:25.556804 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:26.056601 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.056678 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.057039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:26.057096 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:26.556349 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:26.556417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:26.556670 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.056432 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.056529 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:27.555635 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:27.555714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:27.556073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.056242 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:28.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:28.556018 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:28.556350 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:28.556398 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:29.056066 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.056141 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.056558 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:29.556410 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:29.556482 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:29.556819 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.056192 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.056304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.056697 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:30.556347 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:30.556425 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:30.556813 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:30.556882 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:31.056651 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.056724 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.057110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:31.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:31.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:31.556090 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:32.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:32.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:33.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:33.056171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:33.556520 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:33.556626 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:33.557595 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.055632 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.055711 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:34.555832 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:34.555907 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:34.556266 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:35.055820 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.055898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.056262 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:35.056317 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:35.555980 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:35.556055 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:35.556475 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.056253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.056329 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.056689 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:36.556253 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:36.556322 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:36.556585 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:37.056343 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.056417 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.056777 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:37.056832 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:37.556557 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:37.556628 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:37.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.055768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:38.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:38.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:38.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.055671 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.055764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.056076 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:39.555915 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:39.555994 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:39.556298 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:39.556347 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:40.056023 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.056101 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.056458 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:40.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:40.556320 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:40.556648 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.056467 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.056544 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:41.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:41.556710 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:41.556979 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:41.557023 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:42.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.056192 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:42.555713 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:42.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:42.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.055805 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.055880 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.056188 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:43.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:43.555786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:43.556127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:44.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.055912 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.056273 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:44.056335 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:44.555920 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:44.555993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:44.556368 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.058236 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.058319 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.058728 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:45.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:45.556602 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:45.556934 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.055649 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.055727 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.056067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:46.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:46.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:46.556151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:46.556209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:47.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.056162 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:47.555674 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:47.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:47.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.055810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:48.556441 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:48.556514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:48.556832 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:48.556892 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:49.056604 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.056674 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.057002 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:49.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:49.555771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.055675 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:50.555687 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:50.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:50.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:51.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:51.056177 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:51.555865 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:51.555937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:51.556310 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:52.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:52.555777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:52.556123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:53.055816 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.055886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.056231 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:53.056281 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:53.555939 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:53.556010 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:53.556362 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:54.556026 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:54.556097 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:54.556417 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.055678 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.056101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:55.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:55.555734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:55.556067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:55.556129 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:56.055730 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:56.555867 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:56.555946 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:56.556300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.056002 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.056090 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.056457 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:57.556250 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:57.556323 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:57.556654 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:23:57.556712 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:23:58.056487 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.056899 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:58.555623 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:58.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:58.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.055906 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.055982 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.056358 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:23:59.555710 1177669 type.go:168] "Request Body" body=""
	I1218 00:23:59.555803 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:23:59.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:00.059640 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.059720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.060067 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:00.060115 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:00.556240 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:00.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:00.556671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.056422 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.056490 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.056823 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:01.556575 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:01.556648 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:01.556984 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.056108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:02.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:02.555781 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:02.556100 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:02.556145 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:03.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:03.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:03.555789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:03.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.056105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:04.556041 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:04.556112 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:04.556453 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:04.556502 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:05.056259 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.056331 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.056671 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:05.556478 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:05.556557 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:05.556920 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.055645 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.055718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.056059 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:06.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:06.555753 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:06.556081 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:07.056275 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.056343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.056625 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:07.056668 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:07.556435 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:07.556511 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:07.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.055588 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.055660 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.056003 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:08.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:08.555783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:08.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.055807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.055881 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.056246 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:09.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:09.555809 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:09.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:09.556165 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:10.055865 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.055961 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.056303 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:10.555702 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:10.555776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:10.556089 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.055834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.055909 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.056300 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:11.556101 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:11.556170 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:11.556519 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:11.556574 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:12.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.056402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.056727 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:12.556537 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:12.556620 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:12.556973 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.055664 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.056069 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:13.555754 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:13.555826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:13.556183 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:14.055916 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.055993 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.056372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:14.056425 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:14.556306 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:14.556384 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:14.556722 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.056477 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.056546 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.056868 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:15.556641 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:15.556714 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:15.557060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.055691 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.055769 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:16.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:16.555785 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:16.556138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:16.556191 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:17.055685 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.056060 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:17.555722 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:17.555799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:17.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.056016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:18.555625 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:18.555699 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:18.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:19.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.055783 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:19.056164 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:19.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:19.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:19.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.055623 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.055701 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.056014 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:20.555727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:20.555801 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:20.556159 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.055681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:21.555682 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:21.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:21.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:21.556151 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:22.056328 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.056394 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.056666 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:22.556365 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:22.556434 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:22.556767 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.056542 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.056618 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.056908 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:23.555630 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:23.555704 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:23.556032 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:24.055719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:24.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:24.556065 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:24.556148 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:24.556518 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.056084 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.056155 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.056512 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:25.556289 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:25.556358 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:25.556691 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:26.056500 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.056600 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.056966 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:26.057019 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:26.555679 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:26.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:26.556107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.055812 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.055889 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:27.555962 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:27.556038 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:27.556406 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.056112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:28.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:28.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:28.556103 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:28.556166 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.055759 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:29.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:29.555778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:29.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.055831 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:30.555730 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:30.555810 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:30.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:30.556245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:31.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.055789 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.056143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:31.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:31.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:31.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.055797 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.055867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.056187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:32.555678 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:32.555760 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:32.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:33.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:33.056107 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:33.555768 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:33.555842 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:33.556187 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.055758 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.055833 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.056195 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:34.556083 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:34.556178 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:34.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:35.056326 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.056396 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.056747 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:35.056801 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:35.556586 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:35.556657 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:35.557044 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.055710 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.055782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.056144 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:36.555834 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:36.555908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:36.556282 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:37.056627 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.056720 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.057073 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:37.057140 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:37.555781 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:37.555850 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:37.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.055869 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.055947 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.056284 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:38.555704 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:38.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:38.556118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.055844 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.055924 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.056291 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:39.555708 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:39.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:39.556145 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:39.556213 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:40.055683 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:40.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:40.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:40.556070 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.056055 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:41.555718 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:41.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:41.556072 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:42.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:42.056170 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:42.555680 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:42.555751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:42.556013 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.055694 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:43.555675 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:43.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:43.556016 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.055684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.055751 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.056026 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:44.556296 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:44.556369 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:44.556661 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:44.556718 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:45.057279 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.057363 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.057788 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:45.556579 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:45.556652 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:45.556974 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.056085 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:46.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:46.555790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:46.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:47.055703 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.055779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.056086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:47.056132 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:47.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:47.555802 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:47.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:48.555807 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:48.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:48.556199 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.055688 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:49.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:49.555758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:49.556101 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:24:49.556155 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:24:50.055727 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.055806 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.056148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:24:50.555821 1177669 type.go:168] "Request Body" body=""
	I1218 00:24:50.555892 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:24:50.556250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... identical ~500 ms polls of GET https://192.168.49.2:8441/api/v1/nodes/functional-240845 repeated from 00:24:51 through 00:25:24, each receiving no response ("dial tcp 192.168.49.2:8441: connect: connection refused"); the node_ready.go:55 warning 'error getting node "functional-240845" condition "Ready" status (will retry)' recurred roughly every 2.5 s, last logged at W1218 00:25:24.056239 ...]
	I1218 00:25:24.556073 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:24.556144 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:24.556516 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.055718 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:25.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:25.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:25.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:26.055793 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.055868 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.056210 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:26.056286 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:26.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:26.555782 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:26.556109 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.055829 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.055906 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.056250 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:27.555667 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:27.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:27.556046 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.055698 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.056136 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:28.555715 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:28.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:28.556156 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:28.556210 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:29.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.056149 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:29.555716 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:29.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:29.556133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.056614 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.056950 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:30.555633 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:30.555715 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:30.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:31.055809 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.055895 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.056307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:31.056368 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:31.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:31.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:31.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.055756 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:32.555778 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:32.555851 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:32.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:33.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.056351 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:33.056404 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:33.556040 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:33.556451 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.056004 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.056660 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:34.555656 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:34.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:34.556087 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.055725 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.055796 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.056139 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:35.555845 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:35.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:35.556288 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:35.556349 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:36.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.055777 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:36.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:36.555888 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:36.556208 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.055704 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.055786 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.056088 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:37.555655 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:37.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:37.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:38.055607 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.055689 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.056039 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:38.056098 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:38.555753 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:38.555823 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:38.556168 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.055853 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.056286 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:39.556210 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:39.556315 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:39.556639 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:40.056445 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.056865 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:40.056930 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:40.556620 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:40.556695 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:40.557017 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.056104 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:41.555789 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:41.555863 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:41.556189 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.055774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.056163 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:42.555934 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:42.556009 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:42.556328 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:42.556374 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:43.055717 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.055788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.056127 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:43.555816 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:43.555890 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:43.556292 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.055998 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.056073 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:44.556522 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:44.556595 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:44.556924 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:44.556979 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:45.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.056201 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:45.555719 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:45.555794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:45.556128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.056492 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.056564 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.056876 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:46.556642 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:46.556716 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:46.557036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:46.557089 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:47.055763 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.055839 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:47.555908 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:47.555986 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:47.556307 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.055720 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.055800 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.056123 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:48.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:48.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:48.556093 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:49.055679 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.056114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:49.056169 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:49.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:49.555770 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:49.556121 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.055833 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.055910 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.056293 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:50.555974 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:50.556043 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:50.556372 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:51.056056 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.056128 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.056465 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:51.056513 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:51.556270 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:51.556344 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:51.556681 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.056466 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.056539 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.056895 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:52.555612 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:52.555693 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:52.556206 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.055909 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.055981 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:53.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:53.555780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:53.556114 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:53.556175 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:54.055792 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.056260 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:54.556080 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:54.556156 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:54.556472 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.056095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:55.555651 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:55.555728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:55.556079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:56.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.055778 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.056140 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:56.056196 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:56.555837 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:56.555913 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:56.556263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.055959 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.056033 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.056356 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:57.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:57.555762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:57.556095 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.056107 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:58.555780 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:58.555854 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:58.556190 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:25:58.556260 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:25:59.055926 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.056011 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.056422 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:25:59.556262 1177669 type.go:168] "Request Body" body=""
	I1218 00:25:59.556343 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:25:59.556642 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.059082 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.059161 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.059514 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:00.556494 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:00.556566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:00.556913 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:00.556965 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:01.055634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.055719 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.056034 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:01.555696 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:01.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:01.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.055775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.056122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:02.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:02.555754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:02.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:03.055699 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.056098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:03.056198 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:03.555672 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:03.555748 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:03.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.055716 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.055792 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.056115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:04.556167 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:04.556265 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:04.556617 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:05.056442 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.056514 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.056851 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:05.056907 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:05.556597 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:05.556667 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:05.556997 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.055761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:06.555695 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:06.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:06.556092 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.055659 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.055728 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.056019 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:07.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:07.555741 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:07.556068 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:07.556123 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:08.055665 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.055743 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.056061 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:08.555631 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:08.555705 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:08.556036 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.055707 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.055787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.056117 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:09.555670 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:09.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:09.556065 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:10.055722 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.055794 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:10.056268 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:10.555749 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:10.555819 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:10.556146 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.055855 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.055932 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:11.555688 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:11.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:11.556086 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.055697 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:12.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:12.555798 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:12.556143 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:12.556195 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:13.055686 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.055758 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:13.555806 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:13.555886 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:13.556252 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.055957 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.056026 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.056385 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:14.556524 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:14.556596 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:14.556938 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:14.556992 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:15.055676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.055750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.056091 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:15.555805 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:15.555887 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:15.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.055728 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.055807 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.056133 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:16.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:16.555813 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:16.556141 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:17.055842 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.055915 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.056281 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:17.056342 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:17.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:17.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:17.556099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.055700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.055772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.056118 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:18.555824 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:18.555898 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:18.556258 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.055724 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.055795 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.056128 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:19.555983 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:19.556056 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:19.556402 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:19.556458 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:20.056180 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.056281 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.056653 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:20.556480 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:20.556560 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:20.556887 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.056634 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.056717 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.057043 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:21.555691 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:21.555761 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:21.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:22.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.055734 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.056082 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:22.056135 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:22.555799 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:22.555876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:22.556259 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.055950 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.056030 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.056393 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:23.555649 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:23.555721 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:23.556074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.055687 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.055763 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.056062 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:24.556096 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:24.556167 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:24.556536 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:24.556589 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:25.056122 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.056197 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.056567 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:25.556331 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:25.556402 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:25.556737 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.056537 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.056615 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.056954 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:26.555650 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:26.555725 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:26.556052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:27.055715 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.055790 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:27.056160 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:27.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:27.555829 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:27.556148 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.055860 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.055937 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:28.555996 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:28.556069 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:28.556395 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:29.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.056151 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:29.056209 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:29.555706 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:29.555779 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:29.556098 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.055732 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.055808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.056154 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:30.555681 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:30.555757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:30.556094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:31.055778 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.055856 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.056181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:31.056278 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:31.555669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:31.555744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:31.556071 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.056099 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:32.555701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:32.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:32.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:33.055858 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.055938 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.056301 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:33.056353 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:33.556038 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:33.556116 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:33.556445 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.056214 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.056311 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.056650 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:34.556604 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:34.556677 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:34.557012 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:35.056645 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.056718 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.057052 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:35.057102 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:35.555605 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:35.555680 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:35.556018 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.055752 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.055826 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.056172 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:36.555709 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:36.555787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:36.556126 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.055668 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.055744 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.056096 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:37.555797 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:37.555867 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:37.556203 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:37.556272 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:38.055962 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.056083 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.056495 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:38.556271 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:38.556346 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:38.556695 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.055905 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.055976 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.056296 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:39.556206 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:39.556304 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:39.556740 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:39.556792 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:40.056703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.056787 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.057218 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:40.555692 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:40.555773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:40.556122 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.055690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.055762 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.056097 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:41.555677 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:41.555750 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:41.556075 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:42.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.055860 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.056244 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:42.056301 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:42.555953 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:42.556025 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:42.556420 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.055702 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.055776 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.056138 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:43.555853 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:43.555926 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:43.556305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.055784 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.056132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:44.556039 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:44.556118 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:44.556489 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:44.556541 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:45.055766 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.055855 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.056305 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:45.555676 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:45.555747 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:45.556132 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.055771 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.056116 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:46.555793 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:46.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:46.556247 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:47.055949 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.056052 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.056438 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:47.056488 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:47.556244 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:47.556324 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:47.556696 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.056448 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.056524 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.056853 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:48.556536 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:48.556604 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:48.556871 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:49.056619 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.056692 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.057011 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:49.057072 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:49.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:49.555768 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:49.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.055701 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.055773 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.056094 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:50.555671 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:50.555791 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:50.556115 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.055723 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.055799 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.056134 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:51.555726 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:51.555805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:51.556182 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:51.556259 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:52.055764 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.055849 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.056215 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:52.555689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:52.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:52.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.055682 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.055754 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.056106 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:53.555801 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:53.555871 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:53.556202 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:53.556285 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:54.055931 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.056002 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.056407 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:54.556321 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:54.556398 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:54.557023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.055680 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.055757 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.056102 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:55.555795 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:55.555866 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:55.556181 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:56.055731 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.055805 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.056166 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:56.056245 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:56.555703 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:56.555774 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:56.556153 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.055669 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.055739 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.056064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:57.555700 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:57.555772 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:57.556108 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.055689 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.055767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.056079 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:58.555714 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:58.555788 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:58.556119 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:26:58.556172 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:26:59.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.055903 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.056263 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:26:59.555918 1177669 type.go:168] "Request Body" body=""
	I1218 00:26:59.555995 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:26:59.556304 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.056095 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.056184 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.056542 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:00.556340 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:00.556426 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:00.556768 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:00.556820 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:01.056543 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.056617 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.056937 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:01.555665 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:01.555746 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:01.556287 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.056048 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.056120 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.056471 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:02.556307 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:02.556375 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:02.556686 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:03.056489 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.056566 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.056907 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:03.056959 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:03.556548 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:03.556616 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:03.556947 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.055614 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.055691 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.056023 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:04.556067 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:04.556168 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:04.556530 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.055677 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.055755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.056077 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:05.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:05.555764 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:05.556113 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:05.556171 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:06.055692 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.055765 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.056074 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:06.555690 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:06.555767 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:06.556112 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.055795 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.055875 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.056204 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:07.555673 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:07.555749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:07.556083 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:08.055773 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.055846 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.056205 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:08.056297 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:08.555965 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:08.556039 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:08.556379 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.055748 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.055815 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.056152 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:09.555684 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:09.555755 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:09.556105 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.055705 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.055780 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.056111 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:10.555666 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:10.555745 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:10.556064 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:10.556121 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:11.055789 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.055876 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.056251 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:11.555685 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:11.555775 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:11.556110 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.055662 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.055749 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.055985 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:12.555638 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:12.555713 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:12.556042 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:13.055626 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.055698 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.056031 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:27:13.056081 1177669 node_ready.go:55] error getting node "functional-240845" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-240845": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:27:13.555723 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:13.555808 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:13.556147 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.055831 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.055908 1177669 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-240845" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:27:14.056264 1177669 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:27:14.555819 1177669 type.go:168] "Request Body" body=""
	I1218 00:27:14.555886 1177669 node_ready.go:38] duration metric: took 6m0.000394955s for node "functional-240845" to be "Ready" ...
	I1218 00:27:14.559015 1177669 out.go:203] 
	W1218 00:27:14.562031 1177669 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:27:14.562056 1177669 out.go:285] * 
	W1218 00:27:14.564187 1177669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:27:14.567133 1177669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.859940256Z" level=info msg="Checking image status: docker.io/library/minikube-local-cache-test:functional-240845" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.860074193Z" level=info msg="Image docker.io/library/minikube-local-cache-test:functional-240845 not found" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.860115291Z" level=info msg="Neither image nor artifact docker.io/library/minikube-local-cache-test:functional-240845 found" id=50cee970-2587-4094-b1a2-ff23f438e8df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883495023Z" level=info msg="Checking image status: localhost/library/minikube-local-cache-test:functional-240845" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883655026Z" level=info msg="Image localhost/library/minikube-local-cache-test:functional-240845 not found" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:25 functional-240845 crio[2318]: time="2025-12-18T00:27:25.883717046Z" level=info msg="Neither image nor artifact localhost/library/minikube-local-cache-test:functional-240845 found" id=f5228d9e-42e3-4a56-a66a-9de44e1159c5 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.846222226Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=dfdf6c19-4ccf-4476-8a2a-d15b86928295 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.962234333Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=0c43511b-deaa-409e-8891-f106b21b680f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.968806121Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.34.3" id=54131ea5-7906-4ff9-b836-aba473d77fa9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.973003314Z" level=info msg="Creating container: kube-system/kube-apiserver-functional-240845/kube-apiserver" id=f3a96d9e-d6db-4b6f-9fd4-f7c02579fb06 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.973231483Z" level=info msg="Allowed annotations are specified for workload [io.containers.trace-syscall]"
	Dec 18 00:27:26 functional-240845 crio[2318]: time="2025-12-18T00:27:26.977920901Z" level=info msg="createCtr: releasing container name k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1" id=f3a96d9e-d6db-4b6f-9fd4-f7c02579fb06 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198369007Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198493278Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.198527311Z" level=info msg="Neither image nor artifact registry.k8s.io/pause:latest found" id=0607fe01-271d-4608-bfc7-17c813d60240 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.86681905Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.86694314Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.866976583Z" level=info msg="Neither image nor artifact registry.k8s.io/pause:latest found" id=de85b122-38be-4ae2-9043-e286fc87f238 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890744668Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890901808Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.890953064Z" level=info msg="Neither image nor artifact registry.k8s.io/pause:latest found" id=25237d56-0e9c-42bd-8538-8483f52450e6 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917313928Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917480183Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:27 functional-240845 crio[2318]: time="2025-12-18T00:27:27.917537453Z" level=info msg="Neither image nor artifact registry.k8s.io/pause:latest found" id=34af6f9f-a186-421e-82fe-12ea692c8fc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:27:28 functional-240845 crio[2318]: time="2025-12-18T00:27:28.476842553Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=3373db7b-8cec-4c7c-919a-c6e5d09ccb87 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE                                                              CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	3051bfe26a7bd       ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6   58 seconds ago      Exited              storage-provisioner       6                   552e688f4b2fb       storage-provisioner                         kube-system
	56af7390805be       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22   2 minutes ago       Exited              kube-controller-manager   5                   d9cddccbc36e9       kube-controller-manager-functional-240845   kube-system
	3df4b23cd1fc9       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   6 minutes ago       Running             kube-proxy                2                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	9b3fcd7bdcddc       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   6 minutes ago       Running             kindnet-cni               2                   2557f167a47ed       kindnet-84qbm                               kube-system
	fb962917a931f       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   6 minutes ago       Running             etcd                      2                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	f6d062f0f43f4       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   6 minutes ago       Running             kube-scheduler            2                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	45ca9ca01a676       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   6 minutes ago       Running             coredns                   2                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	e79c8e6ec8375       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162   7 minutes ago       Exited              kube-proxy                1                   1c6dc623630a1       kube-proxy-kr6r5                            kube-system
	0fe4c80fa2adf       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13   7 minutes ago       Exited              kindnet-cni               1                   2557f167a47ed       kindnet-84qbm                               kube-system
	95d915f37e740       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42   7 minutes ago       Exited              etcd                      1                   a5a173dfbb1db       etcd-functional-240845                      kube-system
	9caeb1dccc679       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6   7 minutes ago       Exited              kube-scheduler            1                   11c96cd77deed       kube-scheduler-functional-240845            kube-system
	cf507cc725a8d       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc   7 minutes ago       Exited              coredns                   1                   cf088629cf160       coredns-66bc5c9577-mrclk                    kube-system
	2b9f193a1520d       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896   8 minutes ago       Exited              kube-apiserver            0                   e04fd252da213       kube-apiserver-functional-240845            kube-system
	
	
	==> coredns [45ca9ca01a676570a0535560af08d4e95f72145d9702ec8b798ce70d833c0356] <==
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> coredns [cf507cc725a8de48b8aa3b3d59cb3ccad6fe2b67e05c8abbf67bcef83279fe15] <==
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] 127.0.0.1:42709 - 40478 "HINFO IN 7841480554586397634.8984575394038029725. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043241905s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [95d915f37e7403f1e02b614c65a3ca10eca33f9e2ed9a48d7a4e381583714c5e] <==
	{"level":"info","ts":"2025-12-18T00:19:42.456949Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.459892Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.464260Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:19:42.464361Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:19:42.473581Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:19:42.473686Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:19:42.512923Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2025-12-18T00:19:42.860469Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T00:19:42.860559Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-12-18T00:19:42.860705Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.862987Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T00:19:42.863085Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.863106Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"info","ts":"2025-12-18T00:19:42.863197Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-12-18T00:19:42.863210Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863328Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863350Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863361Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863401Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T00:19:42.863409Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T00:19:42.863417Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.876941Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-12-18T00:19:42.877021Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T00:19:42.877059Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:19:42.877083Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-240845","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [fb962917a931fb777a305b1b6998e379972e4d38499641f5d582e94ff93708b1] <==
	{"level":"info","ts":"2025-12-18T00:21:16.766111Z","caller":"fileutil/purge.go:49","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2025-12-18T00:21:16.766332Z","caller":"embed/etcd.go:640","msg":"serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.766371Z","caller":"embed/etcd.go:611","msg":"cmux::serve","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-12-18T00:21:16.767189Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1981","msg":"aec36adc501070cc switched to configuration voters=(12593026477526642892)"}
	{"level":"info","ts":"2025-12-18T00:21:16.767293Z","caller":"membership/cluster.go:433","msg":"ignore already added member","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","added-peer-id":"aec36adc501070cc","added-peer-peer-urls":["https://192.168.49.2:2380"],"added-peer-is-learner":false}
	{"level":"info","ts":"2025-12-18T00:21:16.767389Z","caller":"membership/cluster.go:674","msg":"updated cluster version","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","from":"3.6","to":"3.6"}
	{"level":"info","ts":"2025-12-18T00:21:17.452278Z","logger":"raft","caller":"v3@v3.6.0/raft.go:988","msg":"aec36adc501070cc is starting a new election at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452423Z","logger":"raft","caller":"v3@v3.6.0/raft.go:930","msg":"aec36adc501070cc became pre-candidate at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452498Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgPreVoteResp from aec36adc501070cc at term 3"}
	{"level":"info","ts":"2025-12-18T00:21:17.452538Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.452580Z","logger":"raft","caller":"v3@v3.6.0/raft.go:912","msg":"aec36adc501070cc became candidate at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458217Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1077","msg":"aec36adc501070cc received MsgVoteResp from aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458311Z","logger":"raft","caller":"v3@v3.6.0/raft.go:1693","msg":"aec36adc501070cc has received 1 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2025-12-18T00:21:17.458356Z","logger":"raft","caller":"v3@v3.6.0/raft.go:970","msg":"aec36adc501070cc became leader at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.458401Z","logger":"raft","caller":"v3@v3.6.0/node.go:370","msg":"raft.node: aec36adc501070cc elected leader aec36adc501070cc at term 4"}
	{"level":"info","ts":"2025-12-18T00:21:17.464392Z","caller":"etcdserver/server.go:1820","msg":"published local member to cluster through raft","local-member-id":"aec36adc501070cc","local-member-attributes":"{Name:functional-240845 ClientURLs:[https://192.168.49.2:2379]}","cluster-id":"fa54960ea34d58be","publish-timeout":"7s"}
	{"level":"info","ts":"2025-12-18T00:21:17.464576Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.465463Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.467639Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.49.2:2379"}
	{"level":"info","ts":"2025-12-18T00:21:17.469663Z","caller":"embed/serve.go:138","msg":"ready to serve client requests"}
	{"level":"info","ts":"2025-12-18T00:21:17.484256Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2025-12-18T00:21:17.484501Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2025-12-18T00:21:17.485336Z","caller":"v3rpc/grpc.go:52","msg":"etcdserver: failed to register grpc metrics","error":"duplicate metrics collector registration attempted"}
	{"level":"info","ts":"2025-12-18T00:21:17.490008Z","caller":"v3rpc/health.go:63","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2025-12-18T00:21:17.492585Z","caller":"embed/serve.go:283","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	
	
	==> kernel <==
	 00:27:33 up  7:10,  0 user,  load average: 0.15, 0.40, 1.10
	Linux functional-240845 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [0fe4c80fa2adf97b25fb665d02a2f37ba39e4311d31829700c3a864679f2df2c] <==
	I1218 00:19:41.706085       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 00:19:41.706608       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1218 00:19:41.706796       1 main.go:148] setting mtu 1500 for CNI 
	I1218 00:19:41.706849       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 00:19:41.706886       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T00:19:41Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	E1218 00:19:41.884497       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	I1218 00:19:41.884891       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 00:19:41.884912       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 00:19:41.884921       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 00:19:41.885210       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1218 00:19:41.885327       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:19:41.885420       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:19:41.885714       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:19:42.810791       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kindnet [9b3fcd7bdcddc7326e7d4c50ecf0ebeef85e8ebe52719009cafb599db42b74a4] <==
	E1218 00:23:15.308322       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:23:24.321432       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:23:30.188055       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:23:40.934269       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:23:50.555354       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:11.192172       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:24:18.295989       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:20.463220       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:24:45.334437       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:24:50.163434       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:24:56.363905       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:14.032706       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:25:16.121579       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:25:27.942524       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:25:48.335255       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:25:56.102936       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:05.238670       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:26:10.015965       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:26:40.630863       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1218 00:26:43.534436       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:26:50.773423       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:04.308560       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1218 00:27:16.542428       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1218 00:27:26.936089       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	E1218 00:27:27.587398       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: connect: connection refused" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	
	
	==> kube-apiserver [2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de] <==
	W1218 00:19:35.450481       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450516       1 logging.go:55] [core] [Channel #2 SubChannel #6]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450551       1 logging.go:55] [core] [Channel #7 SubChannel #9]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450585       1 logging.go:55] [core] [Channel #26 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.450620       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451088       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451140       1 logging.go:55] [core] [Channel #255 SubChannel #257]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451176       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.451218       1 logging.go:55] [core] [Channel #187 SubChannel #189]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	I1218 00:19:35.461000       1 controller.go:128] Shutting down kubernetes service endpoint reconciler
	W1218 00:19:35.463755       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463898       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.463993       1 logging.go:55] [core] [Channel #235 SubChannel #237]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464077       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464158       1 logging.go:55] [core] [Channel #199 SubChannel #201]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464191       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464387       1 logging.go:55] [core] [Channel #91 SubChannel #93]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464473       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464539       1 logging.go:55] [core] [Channel #223 SubChannel #225]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464601       1 logging.go:55] [core] [Channel #107 SubChannel #109]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464323       1 logging.go:55] [core] [Channel #155 SubChannel #157]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464353       1 logging.go:55] [core] [Channel #99 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464694       1 logging.go:55] [core] [Channel #227 SubChannel #229]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 00:19:35.464564       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90] <==
	I1218 00:24:41.593979       1 serving.go:386] Generated self-signed cert in-memory
	I1218 00:24:44.118744       1 controllermanager.go:191] "Starting" version="v1.34.3"
	I1218 00:24:44.118773       1 controllermanager.go:193] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 00:24:44.120180       1 dynamic_cafile_content.go:161] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I1218 00:24:44.120356       1 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I1218 00:24:44.120599       1 secure_serving.go:211] Serving securely on 127.0.0.1:10257
	I1218 00:24:44.120986       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E1218 00:24:54.122993       1 controllermanager.go:245] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.49.2:8441/healthz\": dial tcp 192.168.49.2:8441: connect: connection refused"
	
	
	==> kube-proxy [3df4b23cd1fc91cb6876fab74b357bb139f1ea48223b502c7dd9c80ea84c8387] <==
	I1218 00:21:20.402926       1 server_linux.go:53] "Using iptables proxy"
	I1218 00:21:20.489835       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1218 00:21:20.490690       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:21.505281       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:24.355725       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:29.225897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:37.736765       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:21:52.064415       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:22:27.480420       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:23:19.153551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:24:11.289733       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:05.391565       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:25:37.545166       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:16.451554       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:26:57.580830       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:27:28.661385       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-240845&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	
	
	==> kube-proxy [e79c8e6ec83757eb6aa1b369b50330ee975b30dc2ed088357e1394021a5fb563] <==
	
	
	==> kube-scheduler [9caeb1dccc679b8f926a1548b77377c8835ef4e55de1bb30136660346c408ab1] <==
	I1218 00:19:42.975868       1 serving.go:386] Generated self-signed cert in-memory
	W1218 00:19:43.469835       1 authentication.go:397] Error looking up in-cluster authentication configuration: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": dial tcp 192.168.49.2:8441: connect: connection refused
	W1218 00:19:43.469867       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 00:19:43.469874       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 00:19:43.478778       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 00:19:43.478807       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	E1218 00:19:43.478827       1 event.go:401] "Unable start event watcher (will not retry!)" err="broadcaster already stopped"
	I1218 00:19:43.480968       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481035       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481365       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	E1218 00:19:43.481433       1 server.go:286] "handlers are not fully synchronized" err="context canceled"
	E1218 00:19:43.481498       1 shared_informer.go:352] "Unable to sync caches" logger="UnhandledError" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481515       1 configmap_cafile_content.go:213] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 00:19:43.481533       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 00:19:43.481546       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 00:19:43.481689       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 00:19:43.481706       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 00:19:43.481710       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 00:19:43.481721       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [f6d062f0f43f4922799fb3880d16e341783d4d7d586d7db4a50fb1085ef76e6e] <==
	E1218 00:26:28.967293       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:26:33.949400       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 00:26:35.154591       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:26:40.179897       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:26:42.511252       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1218 00:26:44.107019       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:26:46.364045       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1218 00:26:51.526793       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 00:26:53.205541       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:26:53.282506       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1218 00:26:56.102986       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: Get \"https://192.168.49.2:8441/api/v1/replicationcontrollers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 00:26:58.988588       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 00:27:00.599087       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 00:27:06.721194       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 00:27:09.382892       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 00:27:10.196478       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 00:27:11.331530       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 00:27:17.829842       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 00:27:20.022138       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 00:27:21.018254       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 00:27:21.029006       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 00:27:22.672836       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 00:27:28.255997       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1218 00:27:28.760051       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 00:27:29.678535       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	
	
	==> kubelet <==
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961703    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-mrclk\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="39971787-690f-4cc8-814a-be70de00c6a9" pod="kube-system/coredns-66bc5c9577-mrclk"
	Dec 18 00:27:22 functional-240845 kubelet[1315]: E1218 00:27:22.961846    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kindnet-84qbm\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="046ced09-dec4-43cb-848e-b84560229897" pod="kube-system/kindnet-84qbm"
	Dec 18 00:27:24 functional-240845 kubelet[1315]: E1218 00:27:24.597369    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: I1218 00:27:26.961555    1315 scope.go:117] "RemoveContainer" containerID="2b9f193a1520d250894bbe602dd60223e9eafd2211522be9678f1af2f82fd9de"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978201    1315 log.go:32] "CreateContainer in sandbox from runtime service failed" err="rpc error: code = Unknown desc = the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" podSandboxID="e04fd252da21318ab96dfa8b10e5404c17e6ae263ccbb9e9f922d43a78607f1a"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978295    1315 kuberuntime_manager.go:1449] "Unhandled Error" err="container kube-apiserver start failed in pod kube-apiserver-functional-240845_kube-system(deb3e5bf338d69244d476364f7618b54): CreateContainerError: the container name \"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use" logger="UnhandledError"
	Dec 18 00:27:26 functional-240845 kubelet[1315]: E1218 00:27:26.978334    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CreateContainerError: \"the container name \\\"k8s_kube-apiserver_kube-apiserver-functional-240845_kube-system_deb3e5bf338d69244d476364f7618b54_1\\\" is already in use by 3425a89dcf045d535c717e8e94cec97297b2d771ba205b203f664ffb23b9206e. You have to remove that container to be able to reuse that name: that name is already in use\"" pod="kube-system/kube-apiserver-functional-240845" podUID="deb3e5bf338d69244d476364f7618b54"
	Dec 18 00:27:27 functional-240845 kubelet[1315]: E1218 00:27:27.260012    1315 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/events/kube-scheduler-functional-240845.18822743ba5c43bb\": dial tcp 192.168.49.2:8441: connect: connection refused" event="&Event{ObjectMeta:{kube-scheduler-functional-240845.18822743ba5c43bb  kube-system    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-scheduler-functional-240845,UID:8e5e0ee0f3cd0bbcd38493dce832a8ff,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://127.0.0.1:10259/readyz\": dial tcp 127.0.0.1:10259: connect: connection refused,Source:EventSource{Component:kubelet,Host:functional-240845,},FirstTimestamp:2025-12-18 00:19:35.725556667 +0000 UTC m=+22.878905798,LastTimestamp:2025-12-18 00:19:36.726077626 +0000 UTC m=+23.879426766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:functional-240845,}"
	Dec 18 00:27:29 functional-240845 kubelet[1315]: I1218 00:27:29.960690    1315 scope.go:117] "RemoveContainer" containerID="56af7390805be22d2f9bd1f9522c7cc930aae81214d915d09f6f52006f4edc90"
	Dec 18 00:27:29 functional-240845 kubelet[1315]: E1218 00:27:29.960839    1315 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=kube-controller-manager-functional-240845_kube-system(6aa5c667ab761331e5a16029bab33485)\"" pod="kube-system/kube-controller-manager-functional-240845" podUID="6aa5c667ab761331e5a16029bab33485"
	Dec 18 00:27:31 functional-240845 kubelet[1315]: E1218 00:27:31.598760    1315 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.49.2:8441/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused" interval="7s"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.552699    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-18T00:27:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a\\\",\\\"docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1\\\",\\\"docker.io/kindest/kindnetd:v20250512-df8de77b\\\"],\\\"sizeBytes\\\":111333938},{\\\"names\\\":[\\\"docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae\\\",\\\"docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3\\\",\\\"docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88\\\"],\\\"sizeBytes\\\":108362109},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\\\",\\\"registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5\\\",\\\"registry.k8s.io/kube-apiserver:v1.34.3\\\"],\\\"sizeBytes\\\":84818927},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86\\\",\\\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\\\",\\\"registry.k8s.io/kube-proxy:v1.34.3\\\"],\\\"sizeBytes\\\":75941783},{\\\"names\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789\\\",\\\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\\\",\\\"registry.k8s.io/coredns/coredns:v1.12.1\\\"],\\\"sizeBytes\\\":73195387},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1\\\",\\\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\\\",\\\"registry.k8s.io/kube-controller-manager:v1.34.3\\\"],\\\"sizeBytes\\\":72629077},{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\\\",\\\"registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e\\\",\\\"registry.k8s.io/etcd:3.6.5-0\\\"],\\\"sizeBytes\\\":60857170},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611\\\",\\\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\\\",\\\"registry.k8s.io/kube-scheduler:v1.34.3\\\"],\\\"sizeBytes\\\":51592021},{\\\"names\\\":[\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2\\\",\\\"gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944\\\",\\\"gcr.io/k8s-minikube/storage-provisioner:v5\\\"],\\\"sizeBytes\\\":29037500},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\\\",\\\"registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f\\\",\\\"registry.k8s.io/pause:3.10.1\\\"],\\\"sizeBytes\\\":519884}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"functional-240845\": Patch \"https://192.168.49.2:8441/api/v1/nodes/functional-240845/status?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.552960    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.553116    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.553281    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.553455    1315 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"functional-240845\": Get \"https://192.168.49.2:8441/api/v1/nodes/functional-240845?timeout=10s\": dial tcp 192.168.49.2:8441: connect: connection refused"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.553477    1315 kubelet_node_status.go:473] "Unable to update node status" err="update node status exceeds retry count"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.962318    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-mrclk\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="39971787-690f-4cc8-814a-be70de00c6a9" pod="kube-system/coredns-66bc5c9577-mrclk"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.964329    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kindnet-84qbm\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="046ced09-dec4-43cb-848e-b84560229897" pod="kube-system/kindnet-84qbm"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.964830    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/storage-provisioner\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="36dc300a-a099-40d7-874e-e5c2b3795445" pod="kube-system/storage-provisioner"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.965195    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/etcd-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="9257aaeefd3fa4168607b7fbbc0bc32d" pod="kube-system/etcd-functional-240845"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.965579    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-apiserver-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="deb3e5bf338d69244d476364f7618b54" pod="kube-system/kube-apiserver-functional-240845"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.965919    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-controller-manager-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="6aa5c667ab761331e5a16029bab33485" pod="kube-system/kube-controller-manager-functional-240845"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.966240    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-scheduler-functional-240845\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="8e5e0ee0f3cd0bbcd38493dce832a8ff" pod="kube-system/kube-scheduler-functional-240845"
	Dec 18 00:27:32 functional-240845 kubelet[1315]: E1218 00:27:32.966502    1315 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods/kube-proxy-kr6r5\": dial tcp 192.168.49.2:8441: connect: connection refused" podUID="86ad3ff0-4da0-4019-8dc4-c0b794c26b01" pod="kube-system/kube-proxy-kr6r5"
	
	
	==> storage-provisioner [3051bfe26a7bd174b56e8f0a81f1e354e398c53bea0de61d5c0926d2c3821fd0] <==
	I1218 00:26:34.997517       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1218 00:26:34.998962       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-240845 -n functional-240845: exit status 2 (348.402911ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-240845" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctional/serial/MinikubeKubectlCmdDirectly (3.17s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (503s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1
E1218 00:29:32.021199 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.396661 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.403146 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.414535 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.435948 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.477351 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.558794 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:19.720456 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:20.042261 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:20.684355 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:21.966076 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:24.528403 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:29.649869 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:33:39.892172 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:34:00.373693 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:34:32.021263 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:34:41.336055 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:35:55.096138 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:36:03.258392 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1: exit status 109 (8m21.568086419s)

                                                
                                                
-- stdout --
	* [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Found network options:
	  - HTTP_PROXY=localhost:40827
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:40827 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000250925s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001193541s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001193541s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

                                                
                                                
** /stderr **
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1": exit status 109
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
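The `NetworkSettings.Ports` map in the inspect output above has a fixed shape (container port → list of host bindings, where a binding list may be null for unbound ports). A minimal sketch of extracting the host ports from such output with Python's standard library; the sample data is abridged from the dump above:

```python
import json

# Abridged from the `docker inspect` output above.
inspect_json = """
[
  {
    "Name": "/functional-288604",
    "NetworkSettings": {
      "Ports": {
        "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "33925"}],
        "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "33928"}]
      }
    }
  }
]
"""

def host_ports(inspect_output: str) -> dict:
    """Map each exposed container port to its bound host ports."""
    container = json.loads(inspect_output)[0]  # inspect returns a JSON array
    ports = container["NetworkSettings"]["Ports"]
    return {port: [b["HostPort"] for b in (bindings or [])]
            for port, bindings in ports.items()}

print(host_ports(inspect_json))
# {'22/tcp': ['33925'], '8441/tcp': ['33928']}
```

The same shape applies to all five ports (22, 2376, 5000, 8441, 32443) that minikube publishes for this container.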
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 6 (309.17646ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1218 00:37:31.332551 1195496 status.go:458] kubeconfig endpoint: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig

** /stderr **
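The exit status 6 above comes from `status.go`: the profile name `functional-288604` has no entry in the kubeconfig, so `minikube status` cannot resolve an apiserver endpoint. A hypothetical re-creation of that check (the function name, constant, and plain-dict kubeconfig representation here are illustrative, not minikube's actual code):

```python
KUBECONFIG_MISSING = 6  # assumed constant, matching the observed exit status

def check_kubeconfig(profile: str, kubeconfig: dict) -> int:
    """Return 0 if `profile` has a cluster entry in the kubeconfig, else 6."""
    clusters = {c["name"] for c in kubeconfig.get("clusters", [])}
    if profile not in clusters:
        print(f'kubeconfig endpoint: get endpoint: "{profile}" '
              f"does not appear in kubeconfig")
        return KUBECONFIG_MISSING
    return 0

# A stale kubeconfig that still points at the old "minikube-vm" cluster,
# as the WARNING in the stdout block above describes.
stale = {"clusters": [{"name": "minikube-vm"}]}
print(check_kubeconfig("functional-288604", stale))
```

Running `minikube update-context`, as the warning suggests, rewrites the kubeconfig entry so this check passes on the next `status` call.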
helpers_test.go:248: status error: exit status 6 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                           ARGS                                                                            │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/1159552.pem                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image save kicbase/echo-server:functional-240845 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image rm kicbase/echo-server:functional-240845 --alsologtostderr                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/11595522.pem                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/11595522.pem                                                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/test/nested/copy/1159552/hosts                                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image save --daemon kicbase/echo-server:functional-240845 --alsologtostderr                                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format json --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh pgrep buildkitd                                                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image          │ functional-240845 image ls --format yaml --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format table --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format short --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete         │ -p functional-240845                                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start          │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:29:09
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:29:09.489057 1189924 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:29:09.489162 1189924 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:29:09.489165 1189924 out.go:374] Setting ErrFile to fd 2...
	I1218 00:29:09.489169 1189924 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:29:09.489533 1189924 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:29:09.490037 1189924 out.go:368] Setting JSON to false
	I1218 00:29:09.490869 1189924 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25898,"bootTime":1765991852,"procs":150,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:29:09.490954 1189924 start.go:143] virtualization:  
	I1218 00:29:09.495462 1189924 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:29:09.500557 1189924 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:29:09.500739 1189924 notify.go:221] Checking for updates...
	I1218 00:29:09.507391 1189924 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:29:09.510604 1189924 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:29:09.513688 1189924 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:29:09.516755 1189924 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:29:09.519742 1189924 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:29:09.523015 1189924 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:29:09.554369 1189924 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:29:09.554497 1189924 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:29:09.622909 1189924 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-18 00:29:09.613053539 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:29:09.623018 1189924 docker.go:319] overlay module found
	I1218 00:29:09.626753 1189924 out.go:179] * Using the docker driver based on user configuration
	I1218 00:29:09.629817 1189924 start.go:309] selected driver: docker
	I1218 00:29:09.629826 1189924 start.go:927] validating driver "docker" against <nil>
	I1218 00:29:09.629837 1189924 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:29:09.630562 1189924 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:29:09.686275 1189924 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-18 00:29:09.677510487 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:29:09.686420 1189924 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:29:09.686636 1189924 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:29:09.689687 1189924 out.go:179] * Using Docker driver with root privileges
	I1218 00:29:09.692562 1189924 cni.go:84] Creating CNI manager for ""
	I1218 00:29:09.692612 1189924 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:29:09.692620 1189924 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:29:09.692699 1189924 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:29:09.695999 1189924 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:29:09.698874 1189924 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:29:09.701837 1189924 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:29:09.704596 1189924 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:29:09.704632 1189924 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:29:09.704652 1189924 cache.go:65] Caching tarball of preloaded images
	I1218 00:29:09.704651 1189924 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:29:09.704743 1189924 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:29:09.704752 1189924 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:29:09.705111 1189924 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:29:09.705128 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json: {Name:mkb3979857eda1660c7822026db15ae211fd3f58 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:09.723793 1189924 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:29:09.723810 1189924 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:29:09.723824 1189924 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:29:09.723854 1189924 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:29:09.723951 1189924 start.go:364] duration metric: took 84.19µs to acquireMachinesLock for "functional-288604"
	I1218 00:29:09.723978 1189924 start.go:93] Provisioning new machine with config: &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:29:09.724038 1189924 start.go:125] createHost starting for "" (driver="docker")
	I1218 00:29:09.727403 1189924 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1218 00:29:09.727648 1189924 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:40827 to docker env.
	I1218 00:29:09.727672 1189924 start.go:159] libmachine.API.Create for "functional-288604" (driver="docker")
	I1218 00:29:09.727689 1189924 client.go:173] LocalClient.Create starting
	I1218 00:29:09.727742 1189924 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem
	I1218 00:29:09.727771 1189924 main.go:143] libmachine: Decoding PEM data...
	I1218 00:29:09.727786 1189924 main.go:143] libmachine: Parsing certificate...
	I1218 00:29:09.727846 1189924 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem
	I1218 00:29:09.727864 1189924 main.go:143] libmachine: Decoding PEM data...
	I1218 00:29:09.727875 1189924 main.go:143] libmachine: Parsing certificate...
	I1218 00:29:09.728247 1189924 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1218 00:29:09.745885 1189924 cli_runner.go:211] docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1218 00:29:09.745969 1189924 network_create.go:284] running [docker network inspect functional-288604] to gather additional debugging logs...
	I1218 00:29:09.745983 1189924 cli_runner.go:164] Run: docker network inspect functional-288604
	W1218 00:29:09.762315 1189924 cli_runner.go:211] docker network inspect functional-288604 returned with exit code 1
	I1218 00:29:09.762335 1189924 network_create.go:287] error running [docker network inspect functional-288604]: docker network inspect functional-288604: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-288604 not found
	I1218 00:29:09.762346 1189924 network_create.go:289] output of [docker network inspect functional-288604]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-288604 not found
	
	** /stderr **
	I1218 00:29:09.762453 1189924 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:29:09.778573 1189924 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x400190cfa0}
	I1218 00:29:09.778615 1189924 network_create.go:124] attempt to create docker network functional-288604 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1218 00:29:09.778670 1189924 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-288604 functional-288604
	I1218 00:29:09.838690 1189924 network_create.go:108] docker network functional-288604 192.168.49.0/24 created
	I1218 00:29:09.838711 1189924 kic.go:121] calculated static IP "192.168.49.2" for the "functional-288604" container
	I1218 00:29:09.838800 1189924 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1218 00:29:09.858869 1189924 cli_runner.go:164] Run: docker volume create functional-288604 --label name.minikube.sigs.k8s.io=functional-288604 --label created_by.minikube.sigs.k8s.io=true
	I1218 00:29:09.876642 1189924 oci.go:103] Successfully created a docker volume functional-288604
	I1218 00:29:09.876728 1189924 cli_runner.go:164] Run: docker run --rm --name functional-288604-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-288604 --entrypoint /usr/bin/test -v functional-288604:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1218 00:29:10.443753 1189924 oci.go:107] Successfully prepared a docker volume functional-288604
	I1218 00:29:10.443814 1189924 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:29:10.443822 1189924 kic.go:194] Starting extracting preloaded images to volume ...
	I1218 00:29:10.443898 1189924 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v functional-288604:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1218 00:29:14.292768 1189924 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v functional-288604:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.848825999s)
	I1218 00:29:14.292795 1189924 kic.go:203] duration metric: took 3.848966105s to extract preloaded images to volume ...
	W1218 00:29:14.292927 1189924 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1218 00:29:14.293027 1189924 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1218 00:29:14.350283 1189924 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-288604 --name functional-288604 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-288604 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-288604 --network functional-288604 --ip 192.168.49.2 --volume functional-288604:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1218 00:29:14.645798 1189924 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Running}}
	I1218 00:29:14.665066 1189924 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:29:14.690394 1189924 cli_runner.go:164] Run: docker exec functional-288604 stat /var/lib/dpkg/alternatives/iptables
	I1218 00:29:14.744295 1189924 oci.go:144] the created container "functional-288604" has a running status.
	I1218 00:29:14.744315 1189924 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa...
	I1218 00:29:15.515916 1189924 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1218 00:29:15.535541 1189924 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:29:15.552581 1189924 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1218 00:29:15.552593 1189924 kic_runner.go:114] Args: [docker exec --privileged functional-288604 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1218 00:29:15.595282 1189924 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:29:15.613000 1189924 machine.go:94] provisionDockerMachine start ...
	I1218 00:29:15.613081 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:15.629466 1189924 main.go:143] libmachine: Using SSH client type: native
	I1218 00:29:15.629786 1189924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:29:15.629792 1189924 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:29:15.630339 1189924 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:37654->127.0.0.1:33925: read: connection reset by peer
	I1218 00:29:18.783851 1189924 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:29:18.783866 1189924 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:29:18.783931 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:18.801093 1189924 main.go:143] libmachine: Using SSH client type: native
	I1218 00:29:18.801394 1189924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:29:18.801403 1189924 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:29:18.961109 1189924 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:29:18.961197 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:18.978237 1189924 main.go:143] libmachine: Using SSH client type: native
	I1218 00:29:18.978559 1189924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:29:18.978572 1189924 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:29:19.132217 1189924 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:29:19.132253 1189924 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:29:19.132284 1189924 ubuntu.go:190] setting up certificates
	I1218 00:29:19.132292 1189924 provision.go:84] configureAuth start
	I1218 00:29:19.132352 1189924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:29:19.149536 1189924 provision.go:143] copyHostCerts
	I1218 00:29:19.149595 1189924 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:29:19.149602 1189924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:29:19.149676 1189924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:29:19.149763 1189924 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:29:19.149768 1189924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:29:19.149792 1189924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:29:19.149838 1189924 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:29:19.149841 1189924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:29:19.149861 1189924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:29:19.149904 1189924 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:29:19.256714 1189924 provision.go:177] copyRemoteCerts
	I1218 00:29:19.256767 1189924 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:29:19.256804 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:19.273403 1189924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:29:19.379596 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:29:19.395955 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:29:19.412928 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:29:19.429670 1189924 provision.go:87] duration metric: took 297.3641ms to configureAuth
	I1218 00:29:19.429687 1189924 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:29:19.429864 1189924 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:29:19.429970 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:19.446964 1189924 main.go:143] libmachine: Using SSH client type: native
	I1218 00:29:19.447255 1189924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:29:19.447268 1189924 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:29:19.744364 1189924 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:29:19.744379 1189924 machine.go:97] duration metric: took 4.131367124s to provisionDockerMachine
	I1218 00:29:19.744404 1189924 client.go:176] duration metric: took 10.016695829s to LocalClient.Create
	I1218 00:29:19.744423 1189924 start.go:167] duration metric: took 10.01675291s to libmachine.API.Create "functional-288604"
	I1218 00:29:19.744430 1189924 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:29:19.744440 1189924 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:29:19.744508 1189924 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:29:19.744545 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:19.762132 1189924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:29:19.867990 1189924 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:29:19.871126 1189924 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:29:19.871143 1189924 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:29:19.871157 1189924 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:29:19.871207 1189924 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:29:19.871291 1189924 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:29:19.871368 1189924 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:29:19.871414 1189924 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:29:19.878959 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:29:19.895806 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:29:19.912446 1189924 start.go:296] duration metric: took 168.00335ms for postStartSetup
	I1218 00:29:19.912813 1189924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:29:19.929366 1189924 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:29:19.929640 1189924 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:29:19.929679 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:19.946369 1189924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:29:20.061255 1189924 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:29:20.065593 1189924 start.go:128] duration metric: took 10.341541878s to createHost
	I1218 00:29:20.065608 1189924 start.go:83] releasing machines lock for "functional-288604", held for 10.341650445s
	I1218 00:29:20.065680 1189924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:29:20.090674 1189924 out.go:179] * Found network options:
	I1218 00:29:20.093633 1189924 out.go:179]   - HTTP_PROXY=localhost:40827
	W1218 00:29:20.096647 1189924 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1218 00:29:20.099606 1189924 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1218 00:29:20.102614 1189924 ssh_runner.go:195] Run: cat /version.json
	I1218 00:29:20.102667 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:20.102671 1189924 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:29:20.102732 1189924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:29:20.121804 1189924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:29:20.138821 1189924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:29:20.322135 1189924 ssh_runner.go:195] Run: systemctl --version
	I1218 00:29:20.328255 1189924 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:29:20.362683 1189924 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:29:20.366886 1189924 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:29:20.366956 1189924 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:29:20.394064 1189924 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1218 00:29:20.394077 1189924 start.go:496] detecting cgroup driver to use...
	I1218 00:29:20.394107 1189924 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:29:20.394162 1189924 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:29:20.411380 1189924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:29:20.424177 1189924 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:29:20.424301 1189924 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:29:20.441486 1189924 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:29:20.460022 1189924 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:29:20.575869 1189924 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:29:20.702137 1189924 docker.go:234] disabling docker service ...
	I1218 00:29:20.702201 1189924 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:29:20.723880 1189924 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:29:20.737029 1189924 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:29:20.859060 1189924 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:29:20.982759 1189924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:29:20.994980 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:29:21.011542 1189924 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:29:21.011599 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.020489 1189924 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:29:21.020553 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.030803 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.039646 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.048394 1189924 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:29:21.056503 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.065057 1189924 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.078029 1189924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:29:21.086888 1189924 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:29:21.094535 1189924 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:29:21.101931 1189924 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:29:21.217269 1189924 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:29:21.386566 1189924 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:29:21.386632 1189924 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:29:21.390471 1189924 start.go:564] Will wait 60s for crictl version
	I1218 00:29:21.390524 1189924 ssh_runner.go:195] Run: which crictl
	I1218 00:29:21.393941 1189924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:29:21.420956 1189924 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:29:21.421034 1189924 ssh_runner.go:195] Run: crio --version
	I1218 00:29:21.449629 1189924 ssh_runner.go:195] Run: crio --version
	I1218 00:29:21.480655 1189924 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:29:21.483350 1189924 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:29:21.499487 1189924 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:29:21.503603 1189924 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 00:29:21.513529 1189924 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeC
A APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFir
mwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:29:21.513632 1189924 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:29:21.513684 1189924 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:29:21.554001 1189924 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:29:21.554012 1189924 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:29:21.554065 1189924 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:29:21.579313 1189924 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:29:21.579326 1189924 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:29:21.579332 1189924 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:29:21.579423 1189924 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:29:21.579507 1189924 ssh_runner.go:195] Run: crio config
	I1218 00:29:21.642986 1189924 cni.go:84] Creating CNI manager for ""
	I1218 00:29:21.642998 1189924 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:29:21.643009 1189924 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:29:21.643031 1189924 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:29:21.643152 1189924 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
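	[editor's note, not log output] The kubeadm config rendered above is a four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) written to /var/tmp/minikube/kubeadm.yaml.new. An illustrative sketch with a stand-in file of the same shape, showing the documents can be counted with plain shell tools:

```shell
# Stand-in for minikube's generated kubeadm.yaml: same four "kind" documents,
# separated by "---" as in the log above. File path and contents are
# illustrative, not the real generated config.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
# One top-level "kind:" per document in the stream.
kinds=$(grep -c '^kind:' "$cfg")
echo "documents: $kinds"
```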
	I1218 00:29:21.643223 1189924 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:29:21.651136 1189924 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:29:21.651198 1189924 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:29:21.658973 1189924 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:29:21.672353 1189924 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:29:21.685791 1189924 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:29:21.698677 1189924 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:29:21.702280 1189924 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
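	[editor's note, not log output] The /etc/hosts rewrite on the line above is an idempotent drop-then-append: any stale mapping for control-plane.minikube.internal is filtered out before the current IP is re-added. A sketch of the same pattern against a scratch file (no sudo; paths are stand-ins):

```shell
# Reproduce minikube's hosts-file update against a scratch copy: remove any
# existing line for the control-plane name, then append the current mapping.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.49.9\tcontrol-plane.minikube.internal\n' > "$hosts"
{ grep -v 'control-plane\.minikube\.internal$' "$hosts"; \
  printf '192.168.49.2\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
# The stale 192.168.49.9 mapping is gone; exactly one entry remains.
grep 'control-plane.minikube.internal' "$hosts"
```

Running the same block twice leaves the file unchanged, which is why minikube can apply it unconditionally on every start.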
	I1218 00:29:21.712150 1189924 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:29:21.828449 1189924 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:29:21.845132 1189924 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:29:21.845144 1189924 certs.go:195] generating shared ca certs ...
	I1218 00:29:21.845159 1189924 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:21.845295 1189924 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:29:21.845336 1189924 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:29:21.845342 1189924 certs.go:257] generating profile certs ...
	I1218 00:29:21.845393 1189924 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:29:21.845402 1189924 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt with IP's: []
	I1218 00:29:22.128879 1189924 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt ...
	I1218 00:29:22.128895 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: {Name:mk0ca2a7785e1c4591dc3bcffd87f1bacecbfcab Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.129100 1189924 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key ...
	I1218 00:29:22.129106 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key: {Name:mkccb24864106af98d3ed319e7fea28358f7b940 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.129201 1189924 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:29:22.129213 1189924 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt.9182ce28 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1218 00:29:22.489995 1189924 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt.9182ce28 ...
	I1218 00:29:22.490011 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt.9182ce28: {Name:mkb8992fd266970e71850b17dde48a2343c1290a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.490203 1189924 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28 ...
	I1218 00:29:22.490214 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28: {Name:mk11006d24a846f912ec6f0083f233ea2d8b0988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.490300 1189924 certs.go:382] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt.9182ce28 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt
	I1218 00:29:22.490371 1189924 certs.go:386] copying /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key
	I1218 00:29:22.490429 1189924 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:29:22.490441 1189924 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt with IP's: []
	I1218 00:29:22.737816 1189924 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt ...
	I1218 00:29:22.737830 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt: {Name:mk5e010bfdd015cf85a42a213239d3e7f4d4e4bd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.738021 1189924 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key ...
	I1218 00:29:22.738028 1189924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key: {Name:mkd9d2609481dc6e3489e5fcb018dc67de4bfb2b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:29:22.738220 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:29:22.738264 1189924 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:29:22.738271 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:29:22.738297 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:29:22.738327 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:29:22.738351 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:29:22.738395 1189924 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:29:22.738966 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:29:22.758364 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:29:22.776585 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:29:22.793976 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:29:22.810685 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:29:22.827653 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:29:22.844740 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:29:22.861016 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:29:22.878218 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:29:22.895443 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:29:22.912331 1189924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:29:22.929040 1189924 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:29:22.941776 1189924 ssh_runner.go:195] Run: openssl version
	I1218 00:29:22.947739 1189924 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:29:22.954726 1189924 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:29:22.961934 1189924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:29:22.965449 1189924 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:29:22.965516 1189924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:29:23.007189 1189924 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:29:23.014784 1189924 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1218 00:29:23.022119 1189924 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:29:23.029388 1189924 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:29:23.036670 1189924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:29:23.040212 1189924 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:29:23.040282 1189924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:29:23.080877 1189924 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:29:23.088145 1189924 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/1159552.pem /etc/ssl/certs/51391683.0
	I1218 00:29:23.095230 1189924 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:29:23.102360 1189924 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:29:23.109414 1189924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:29:23.113047 1189924 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:29:23.113108 1189924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:29:23.155347 1189924 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:29:23.162517 1189924 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/11595522.pem /etc/ssl/certs/3ec20f2e.0
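	[editor's note, not log output] The `openssl x509 -hash` / `ln -fs` pairs above build OpenSSL's hashed-lookup directory layout: each CA is reachable as `<subject-hash>.0` so verification by `-CApath` can find it. A self-contained sketch in a scratch directory (assumes the `openssl` CLI is available; cert subject and paths are throwaway):

```shell
# Create a throwaway self-signed CA, compute its subject hash, and link it
# as <hash>.0 -- the same layout minikube maintains under /etc/ssl/certs.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=demoCA' \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 2>/dev/null
h=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$h.0"
# Lookup via the hashed directory now resolves the CA.
openssl verify -CApath "$dir" "$dir/ca.pem"
```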
	I1218 00:29:23.169667 1189924 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:29:23.172949 1189924 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1218 00:29:23.172991 1189924 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:29:23.173053 1189924 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:29:23.173123 1189924 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:29:23.198991 1189924 cri.go:89] found id: ""
	I1218 00:29:23.199051 1189924 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:29:23.206435 1189924 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:29:23.213676 1189924 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:29:23.213729 1189924 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:29:23.221540 1189924 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:29:23.221557 1189924 kubeadm.go:158] found existing configuration files:
	
	I1218 00:29:23.221605 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:29:23.228716 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:29:23.228775 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:29:23.235620 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:29:23.242877 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:29:23.242930 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:29:23.249773 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:29:23.257213 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:29:23.257272 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:29:23.264087 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:29:23.271524 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:29:23.271581 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:29:23.278581 1189924 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:29:23.419888 1189924 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:29:23.420320 1189924 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:29:23.489745 1189924 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:33:28.113810 1189924 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1218 00:33:28.113841 1189924 kubeadm.go:319] 
	I1218 00:33:28.113931 1189924 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:33:28.114031 1189924 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:33:28.114084 1189924 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:33:28.114219 1189924 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:33:28.114284 1189924 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:33:28.114318 1189924 kubeadm.go:319] OS: Linux
	I1218 00:33:28.114361 1189924 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:33:28.114421 1189924 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:33:28.114502 1189924 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:33:28.114558 1189924 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:33:28.114612 1189924 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:33:28.114668 1189924 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:33:28.114713 1189924 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:33:28.114776 1189924 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:33:28.114841 1189924 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:33:28.114913 1189924 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:33:28.115007 1189924 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:33:28.115097 1189924 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:33:28.115158 1189924 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:33:28.118298 1189924 out.go:252]   - Generating certificates and keys ...
	I1218 00:33:28.118392 1189924 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:33:28.118485 1189924 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:33:28.118562 1189924 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1218 00:33:28.118622 1189924 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1218 00:33:28.118702 1189924 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1218 00:33:28.118768 1189924 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1218 00:33:28.118847 1189924 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1218 00:33:28.118978 1189924 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:33:28.119032 1189924 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1218 00:33:28.119149 1189924 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1218 00:33:28.119213 1189924 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1218 00:33:28.119287 1189924 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1218 00:33:28.119346 1189924 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1218 00:33:28.119413 1189924 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:33:28.119476 1189924 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:33:28.119575 1189924 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:33:28.119671 1189924 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:33:28.119743 1189924 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:33:28.119804 1189924 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:33:28.119925 1189924 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:33:28.120001 1189924 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:33:28.125243 1189924 out.go:252]   - Booting up control plane ...
	I1218 00:33:28.125380 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:33:28.125481 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:33:28.125598 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:33:28.125723 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:33:28.125827 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:33:28.125974 1189924 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:33:28.126066 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:33:28.126105 1189924 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:33:28.126254 1189924 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:33:28.126376 1189924 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:33:28.126439 1189924 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000250925s
	I1218 00:33:28.126442 1189924 kubeadm.go:319] 
	I1218 00:33:28.126498 1189924 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:33:28.126529 1189924 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:33:28.126666 1189924 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:33:28.126670 1189924 kubeadm.go:319] 
	I1218 00:33:28.126788 1189924 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:33:28.126842 1189924 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:33:28.126885 1189924 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	W1218 00:33:28.127031 1189924 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-288604 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000250925s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 00:33:28.127141 1189924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:33:28.128379 1189924 kubeadm.go:319] 
	I1218 00:33:28.553367 1189924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:33:28.566497 1189924 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:33:28.566550 1189924 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:33:28.574671 1189924 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:33:28.574680 1189924 kubeadm.go:158] found existing configuration files:
	
	I1218 00:33:28.574734 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:33:28.582623 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:33:28.582685 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:33:28.590274 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:33:28.597935 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:33:28.597991 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:33:28.605205 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:33:28.612858 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:33:28.612914 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:33:28.620500 1189924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:33:28.628145 1189924 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:33:28.628201 1189924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:33:28.635558 1189924 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:33:28.674602 1189924 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:33:28.674786 1189924 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:33:28.751005 1189924 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:33:28.751069 1189924 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:33:28.751107 1189924 kubeadm.go:319] OS: Linux
	I1218 00:33:28.751151 1189924 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:33:28.751198 1189924 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:33:28.751244 1189924 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:33:28.751290 1189924 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:33:28.751337 1189924 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:33:28.751383 1189924 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:33:28.751428 1189924 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:33:28.751474 1189924 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:33:28.751519 1189924 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:33:28.824628 1189924 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:33:28.824755 1189924 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:33:28.824865 1189924 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:33:28.832722 1189924 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:33:28.836159 1189924 out.go:252]   - Generating certificates and keys ...
	I1218 00:33:28.836310 1189924 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:33:28.836488 1189924 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:33:28.836568 1189924 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:33:28.836635 1189924 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:33:28.836715 1189924 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:33:28.836790 1189924 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:33:28.836882 1189924 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:33:28.836939 1189924 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:33:28.837043 1189924 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:33:28.837421 1189924 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:33:28.837799 1189924 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:33:28.837853 1189924 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:33:29.075206 1189924 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:33:29.513665 1189924 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:33:29.895992 1189924 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:33:30.279767 1189924 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:33:30.385497 1189924 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:33:30.386154 1189924 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:33:30.388764 1189924 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:33:30.392067 1189924 out.go:252]   - Booting up control plane ...
	I1218 00:33:30.392161 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:33:30.392259 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:33:30.392331 1189924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:33:30.408168 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:33:30.408283 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:33:30.415983 1189924 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:33:30.416567 1189924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:33:30.416629 1189924 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:33:30.547095 1189924 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:33:30.547201 1189924 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:37:30.546946 1189924 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001193541s
	I1218 00:37:30.546970 1189924 kubeadm.go:319] 
	I1218 00:37:30.547075 1189924 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:37:30.547131 1189924 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:37:30.547463 1189924 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:37:30.547468 1189924 kubeadm.go:319] 
	I1218 00:37:30.547658 1189924 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:37:30.547947 1189924 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:37:30.548002 1189924 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:37:30.548005 1189924 kubeadm.go:319] 
	I1218 00:37:30.552260 1189924 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:37:30.552982 1189924 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:37:30.553316 1189924 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:37:30.553599 1189924 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 00:37:30.553604 1189924 kubeadm.go:319] 
	I1218 00:37:30.553729 1189924 kubeadm.go:403] duration metric: took 8m7.380741231s to StartCluster
	I1218 00:37:30.553758 1189924 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:37:30.553758 1189924 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:37:30.553814 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:37:30.582291 1189924 cri.go:89] found id: ""
	I1218 00:37:30.582303 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.582311 1189924 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:37:30.582316 1189924 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:37:30.582371 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:37:30.610309 1189924 cri.go:89] found id: ""
	I1218 00:37:30.610331 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.610338 1189924 logs.go:284] No container was found matching "etcd"
	I1218 00:37:30.610343 1189924 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:37:30.610405 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:37:30.634685 1189924 cri.go:89] found id: ""
	I1218 00:37:30.634699 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.634706 1189924 logs.go:284] No container was found matching "coredns"
	I1218 00:37:30.634711 1189924 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:37:30.634765 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:37:30.668036 1189924 cri.go:89] found id: ""
	I1218 00:37:30.668050 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.668057 1189924 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:37:30.668062 1189924 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:37:30.668132 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:37:30.702013 1189924 cri.go:89] found id: ""
	I1218 00:37:30.702026 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.702033 1189924 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:37:30.702039 1189924 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:37:30.702095 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:37:30.729744 1189924 cri.go:89] found id: ""
	I1218 00:37:30.729758 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.729765 1189924 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:37:30.729770 1189924 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:37:30.729826 1189924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:37:30.759735 1189924 cri.go:89] found id: ""
	I1218 00:37:30.759748 1189924 logs.go:282] 0 containers: []
	W1218 00:37:30.759755 1189924 logs.go:284] No container was found matching "kindnet"
	I1218 00:37:30.759763 1189924 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:37:30.759773 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:37:30.792382 1189924 logs.go:123] Gathering logs for container status ...
	I1218 00:37:30.792402 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:37:30.822676 1189924 logs.go:123] Gathering logs for kubelet ...
	I1218 00:37:30.822692 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:37:30.892254 1189924 logs.go:123] Gathering logs for dmesg ...
	I1218 00:37:30.892273 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:37:30.908270 1189924 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:37:30.908284 1189924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:37:30.981485 1189924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:37:30.973068    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.973816    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.975494    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.976118    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.977685    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:37:30.973068    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.973816    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.975494    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.976118    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:30.977685    4882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	W1218 00:37:30.981508 1189924 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001193541s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 00:37:30.981537 1189924 out.go:285] * 
	W1218 00:37:30.981595 1189924 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001193541s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:37:30.981609 1189924 out.go:285] * 
	W1218 00:37:30.983726 1189924 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:37:30.990349 1189924 out.go:203] 
	W1218 00:37:30.994009 1189924 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001193541s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:37:30.994132 1189924 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 00:37:30.994172 1189924 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 00:37:30.997284 1189924 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381229221Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381262951Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381301063Z" level=info msg="Create NRI interface"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381404272Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381411623Z" level=info msg="runtime interface created"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381421945Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381429715Z" level=info msg="runtime interface starting up..."
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381435188Z" level=info msg="starting plugins..."
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381448086Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:29:21 functional-288604 crio[843]: time="2025-12-18T00:29:21.381510607Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:29:21 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.493105346Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=143f88e7-0960-4e17-b7b9-68dea5869543 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.493870549Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=04ff5d2d-c507-4c94-b06b-394bb773fbbf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.494376872Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=937b72c9-1db4-4347-b94c-1f04a3125688 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.494859277Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=394275e7-5284-4641-b798-9cc93b9df341 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.495390887Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=01683a0c-7f70-470d-b3c6-7d933f2248fc name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.495895626Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=fd22ddec-4ad9-47e1-a1db-b8b7991eaf8e name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:29:23 functional-288604 crio[843]: time="2025-12-18T00:29:23.496452384Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=a2e3556a-345c-48a8-a8f4-bf8df5e2e7ea name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.82786199Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=6ad0d753-0e2c-4b86-a517-ac514f0d778d name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.828584404Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=a1a663ed-ecea-4ed7-8978-76d77cd4b4c2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.82913692Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=be8bffbf-fe85-4628-a9b2-f2bd45d955bd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.829580434Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=6dfcd400-2057-40ae-9f9a-c22e74c1c444 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.829999474Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=a1a4b8b1-3c04-4f65-b35b-86a7b951d9e7 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.830426209Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=7ab8bce0-a68f-4698-9c9f-2ac04c03885d name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:33:28 functional-288604 crio[843]: time="2025-12-18T00:33:28.830867664Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=b4542a62-94bc-4c2d-8677-b381be2b96f3 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:37:31.975997    4989 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:31.976796    4989 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:31.977924    4989 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:31.978514    4989 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:37:31.980247    4989 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:37:32 up  7:19,  0 user,  load average: 0.12, 0.33, 0.81
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:37:29 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:37:29 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 647.
	Dec 18 00:37:29 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:29 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:29 functional-288604 kubelet[4796]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:29 functional-288604 kubelet[4796]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:29 functional-288604 kubelet[4796]: E1218 00:37:29.933392    4796 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:37:29 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:37:29 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:37:30 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 648.
	Dec 18 00:37:30 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:30 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:30 functional-288604 kubelet[4827]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:30 functional-288604 kubelet[4827]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:30 functional-288604 kubelet[4827]: E1218 00:37:30.706842    4827 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:37:30 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:37:30 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:37:31 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 649.
	Dec 18 00:37:31 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:31 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:37:31 functional-288604 kubelet[4901]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:31 functional-288604 kubelet[4901]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:37:31 functional-288604 kubelet[4901]: E1218 00:37:31.453210    4901 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:37:31 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:37:31 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 6 (336.768242ms)

                                                
                                                
-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 00:37:32.427232 1195714 status.go:458] kubeconfig endpoint: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:263: status error: exit status 6 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (503.00s)
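Editor's note: the kubelet journal above shows the actual failure ("kubelet is configured to not run on a host using cgroup v1"), which matches the kubeadm SystemVerification warning: on a cgroup v1 host, kubelet v1.35+ refuses to start unless cgroup v1 support is explicitly re-enabled. As a hedged sketch only (field name taken from the warning text; how minikube wires this into its kubeadm patches is not shown in this log), the corresponding KubeletConfiguration override would look like:

```yaml
# Sketch: KubeletConfiguration fragment opting back into cgroup v1 support,
# per the kubeadm warning above. Verify the field name and default against
# the kubelet version actually in use before relying on this.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

The warning also says the SystemVerification check must be skipped explicitly; the failing command line already passes `--ignore-preflight-errors=...SystemVerification...`, so on this run only the kubelet configuration option appears to be missing.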

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (369.03s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart
I1218 00:37:32.443613 1159552 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --alsologtostderr -v=8
E1218 00:38:19.390643 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:38:47.100265 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:39:32.021329 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:43:19.390629 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-288604 --alsologtostderr -v=8: exit status 80 (6m5.972401888s)

                                                
                                                
-- stdout --
	* [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1218 00:37:32.486183 1195787 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:37:32.486610 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486624 1195787 out.go:374] Setting ErrFile to fd 2...
	I1218 00:37:32.486629 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486918 1195787 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:37:32.487313 1195787 out.go:368] Setting JSON to false
	I1218 00:37:32.488152 1195787 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26401,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:37:32.488255 1195787 start.go:143] virtualization:  
	I1218 00:37:32.491971 1195787 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:37:32.494842 1195787 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:37:32.494944 1195787 notify.go:221] Checking for updates...
	I1218 00:37:32.500434 1195787 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:37:32.503311 1195787 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:32.506071 1195787 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:37:32.508979 1195787 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:37:32.511873 1195787 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:37:32.515326 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:32.515476 1195787 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:37:32.549560 1195787 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:37:32.549709 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.608968 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.600331572 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.609068 1195787 docker.go:319] overlay module found
	I1218 00:37:32.612053 1195787 out.go:179] * Using the docker driver based on existing profile
	I1218 00:37:32.614859 1195787 start.go:309] selected driver: docker
	I1218 00:37:32.614879 1195787 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.614985 1195787 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:37:32.615081 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.681718 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.67244891 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.682130 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:32.682189 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:32.682255 1195787 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.687138 1195787 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:37:32.690134 1195787 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:37:32.693078 1195787 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:37:32.696069 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:32.696123 1195787 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:37:32.696143 1195787 cache.go:65] Caching tarball of preloaded images
	I1218 00:37:32.696183 1195787 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:37:32.696303 1195787 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:37:32.696317 1195787 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:37:32.696417 1195787 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:37:32.714975 1195787 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:37:32.714995 1195787 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:37:32.715013 1195787 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:37:32.715043 1195787 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:37:32.715099 1195787 start.go:364] duration metric: took 33.796µs to acquireMachinesLock for "functional-288604"
	I1218 00:37:32.715121 1195787 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:37:32.715131 1195787 fix.go:54] fixHost starting: 
	I1218 00:37:32.715395 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:32.731575 1195787 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:37:32.731606 1195787 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:37:32.734910 1195787 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:37:32.734955 1195787 machine.go:94] provisionDockerMachine start ...
	I1218 00:37:32.735034 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.751418 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.751747 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.751760 1195787 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:37:32.904326 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:32.904350 1195787 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:37:32.904413 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.933199 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.933525 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.933536 1195787 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:37:33.096692 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:33.096816 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.115124 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.115445 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.115466 1195787 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:37:33.272592 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:37:33.272617 1195787 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:37:33.272637 1195787 ubuntu.go:190] setting up certificates
	I1218 00:37:33.272647 1195787 provision.go:84] configureAuth start
	I1218 00:37:33.272712 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:33.291737 1195787 provision.go:143] copyHostCerts
	I1218 00:37:33.291803 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291863 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:37:33.291880 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291977 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:37:33.292105 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292127 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:37:33.292137 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292177 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:37:33.292274 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292300 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:37:33.292315 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292347 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:37:33.292433 1195787 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:37:33.397529 1195787 provision.go:177] copyRemoteCerts
	I1218 00:37:33.397646 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:37:33.397692 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.416603 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:33.523879 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:37:33.523950 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:37:33.540143 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:37:33.540204 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:37:33.557091 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:37:33.557194 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:37:33.573937 1195787 provision.go:87] duration metric: took 301.27685ms to configureAuth
	I1218 00:37:33.573963 1195787 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:37:33.574138 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:33.574247 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.591351 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.591663 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.591676 1195787 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:37:33.932454 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:37:33.932478 1195787 machine.go:97] duration metric: took 1.197515142s to provisionDockerMachine
	I1218 00:37:33.932490 1195787 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:37:33.932503 1195787 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:37:33.932581 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:37:33.932636 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.953296 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.060199 1195787 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:37:34.063627 1195787 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:37:34.063655 1195787 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:37:34.063660 1195787 command_runner.go:130] > VERSION_ID="12"
	I1218 00:37:34.063664 1195787 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:37:34.063680 1195787 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:37:34.063684 1195787 command_runner.go:130] > ID=debian
	I1218 00:37:34.063689 1195787 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:37:34.063694 1195787 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:37:34.063700 1195787 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:37:34.063783 1195787 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:37:34.063800 1195787 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:37:34.063810 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:37:34.063871 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:37:34.063955 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:37:34.063966 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:37:34.064048 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:37:34.064056 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:37:34.064100 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:37:34.071756 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:34.089207 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:37:34.106978 1195787 start.go:296] duration metric: took 174.472072ms for postStartSetup
	I1218 00:37:34.107054 1195787 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:37:34.107096 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.124265 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.224786 1195787 command_runner.go:130] > 12%
	I1218 00:37:34.224858 1195787 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:37:34.228879 1195787 command_runner.go:130] > 171G
	I1218 00:37:34.229324 1195787 fix.go:56] duration metric: took 1.514188493s for fixHost
	I1218 00:37:34.229353 1195787 start.go:83] releasing machines lock for "functional-288604", held for 1.514233177s
	I1218 00:37:34.229425 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:34.246154 1195787 ssh_runner.go:195] Run: cat /version.json
	I1218 00:37:34.246206 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.246451 1195787 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:37:34.246509 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.266363 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.276260 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.371623 1195787 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:37:34.371754 1195787 ssh_runner.go:195] Run: systemctl --version
	I1218 00:37:34.461010 1195787 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:37:34.461057 1195787 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:37:34.461077 1195787 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:37:34.461152 1195787 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:37:34.497659 1195787 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:37:34.501645 1195787 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:37:34.502005 1195787 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:37:34.502070 1195787 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:37:34.509755 1195787 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:37:34.509780 1195787 start.go:496] detecting cgroup driver to use...
	I1218 00:37:34.509811 1195787 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:37:34.509875 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:37:34.523916 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:37:34.536646 1195787 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:37:34.536736 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:37:34.551504 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:37:34.564054 1195787 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:37:34.675890 1195787 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:37:34.798642 1195787 docker.go:234] disabling docker service ...
	I1218 00:37:34.798703 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:37:34.813006 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:37:34.825087 1195787 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:37:34.942798 1195787 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:37:35.067868 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:37:35.088600 1195787 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:37:35.102366 1195787 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:37:35.103752 1195787 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:37:35.103819 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.113147 1195787 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:37:35.113241 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.122530 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.131393 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.140799 1195787 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:37:35.148737 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.157396 1195787 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.165643 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.174650 1195787 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:37:35.181215 1195787 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:37:35.182122 1195787 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:37:35.189136 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.306446 1195787 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:37:35.483449 1195787 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:37:35.483550 1195787 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:37:35.487145 1195787 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:37:35.487172 1195787 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:37:35.487179 1195787 command_runner.go:130] > Device: 0,72	Inode: 1642        Links: 1
	I1218 00:37:35.487186 1195787 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:35.487202 1195787 command_runner.go:130] > Access: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487220 1195787 command_runner.go:130] > Modify: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487225 1195787 command_runner.go:130] > Change: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487229 1195787 command_runner.go:130] >  Birth: -
	I1218 00:37:35.487254 1195787 start.go:564] Will wait 60s for crictl version
	I1218 00:37:35.487306 1195787 ssh_runner.go:195] Run: which crictl
	I1218 00:37:35.490344 1195787 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:37:35.490683 1195787 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:37:35.512944 1195787 command_runner.go:130] > Version:  0.1.0
	I1218 00:37:35.513232 1195787 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:37:35.513363 1195787 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:37:35.513391 1195787 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:37:35.515559 1195787 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:37:35.515677 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.541522 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.541589 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.541609 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.541630 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.541651 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.541672 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.541692 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.541720 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.541741 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.541768 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.541786 1195787 command_runner.go:130] >      static
	I1218 00:37:35.541805 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.541829 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.541856 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.541889 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.541915 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.541933 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.541952 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.541983 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.541999 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.543191 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.569029 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.569102 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.569122 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.569144 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.569164 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.569191 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.569210 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.569239 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.569267 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.569285 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.569302 1195787 command_runner.go:130] >      static
	I1218 00:37:35.569320 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.569347 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.569366 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.569384 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.569405 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.569429 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.569449 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.569467 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.569485 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.575974 1195787 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:37:35.578737 1195787 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:37:35.594362 1195787 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:37:35.598161 1195787 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:37:35.598363 1195787 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:37:35.598485 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:35.598543 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.635547 1195787 command_runner.go:130] > {
	I1218 00:37:35.635578 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.635584 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635591 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.635596 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635602 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.635605 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635609 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635623 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.635631 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.635634 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635639 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.635643 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635650 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635654 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635657 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635668 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.635672 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635677 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.635680 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635684 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635693 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.635701 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.635704 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635709 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.635712 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635719 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635723 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635731 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.635735 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635740 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.635743 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635747 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635758 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.635773 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.635777 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635781 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.635786 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.635790 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635793 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635795 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635802 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.635805 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635810 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.635815 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635823 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635830 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.635838 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.635841 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635845 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.635848 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635852 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635855 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635864 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635868 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635872 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635875 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635881 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.635885 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635890 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.635893 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635897 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635905 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.635912 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.635915 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635920 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.635926 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635930 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635934 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635938 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635941 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635944 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635947 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635954 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.635957 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635963 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.635966 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635970 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635978 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.635986 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.635989 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635993 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.635997 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636000 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636003 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636007 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636011 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636013 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636016 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636022 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.636027 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636032 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.636035 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636039 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636046 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.636054 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.636057 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636060 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.636064 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636073 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636077 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636079 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636086 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.636090 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636095 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.636098 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636102 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636110 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.636125 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.636133 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636137 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.636140 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636144 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636147 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636151 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636154 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636158 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636160 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636166 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.636170 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636175 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.636178 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636182 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636190 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.636197 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.636200 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636204 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.636208 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636211 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.636214 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636238 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636243 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.636251 1195787 command_runner.go:130] >     }
	I1218 00:37:35.636254 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.636256 1195787 command_runner.go:130] > }
	I1218 00:37:35.636431 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.636439 1195787 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:37:35.636495 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.658094 1195787 command_runner.go:130] > {
	I1218 00:37:35.658111 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.658115 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658124 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.658128 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658134 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.658137 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658141 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658151 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.658159 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.658163 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658167 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.658171 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658176 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658179 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658182 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658189 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.658192 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658198 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.658201 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658205 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658213 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.658222 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.658225 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658229 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.658233 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658242 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658250 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658262 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658269 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.658273 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658279 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.658282 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658286 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658294 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.658302 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.658305 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658309 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.658313 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.658317 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658321 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658323 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658330 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.658334 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658339 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.658344 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658348 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658356 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.658367 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.658370 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658374 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.658378 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658382 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658384 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658393 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658397 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658400 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658403 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658410 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.658413 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658425 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.658431 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658435 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658443 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.658455 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.658465 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658469 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.658472 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658476 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658479 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658483 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658487 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658490 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658493 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658499 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.658503 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658508 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.658511 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658515 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658523 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.658532 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.658535 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658539 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.658543 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658549 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658552 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658556 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658560 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658563 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658566 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658572 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.658577 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658582 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.658589 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658598 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658605 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.658613 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.658616 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658620 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.658624 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658628 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658631 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658642 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658650 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.658653 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658659 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.658662 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658666 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658677 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.658694 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.658697 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658701 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.658705 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658708 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658711 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658715 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658718 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658721 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658731 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.658734 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658739 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.658742 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658746 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658754 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.658761 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.658772 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658777 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.658781 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658784 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.658788 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658791 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658794 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.658798 1195787 command_runner.go:130] >     }
	I1218 00:37:35.658800 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.658803 1195787 command_runner.go:130] > }
	I1218 00:37:35.660205 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.660262 1195787 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:37:35.660279 1195787 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:37:35.660385 1195787 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:37:35.660470 1195787 ssh_runner.go:195] Run: crio config
	I1218 00:37:35.707278 1195787 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:37:35.707300 1195787 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:37:35.707307 1195787 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:37:35.707310 1195787 command_runner.go:130] > #
	I1218 00:37:35.707318 1195787 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:37:35.707324 1195787 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:37:35.707330 1195787 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:37:35.707346 1195787 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:37:35.707349 1195787 command_runner.go:130] > # reload'.
	I1218 00:37:35.707356 1195787 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:37:35.707362 1195787 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:37:35.707368 1195787 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:37:35.707383 1195787 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:37:35.707387 1195787 command_runner.go:130] > [crio]
	I1218 00:37:35.707393 1195787 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:37:35.707398 1195787 command_runner.go:130] > # containers images, in this directory.
	I1218 00:37:35.707595 1195787 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:37:35.707607 1195787 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:37:35.707620 1195787 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:37:35.707627 1195787 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:37:35.707631 1195787 command_runner.go:130] > # imagestore = ""
	I1218 00:37:35.707637 1195787 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:37:35.707643 1195787 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:37:35.707768 1195787 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:37:35.707777 1195787 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:37:35.707784 1195787 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:37:35.707788 1195787 command_runner.go:130] > # storage_option = [
	I1218 00:37:35.707935 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.707952 1195787 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:37:35.707959 1195787 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:37:35.707971 1195787 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:37:35.707978 1195787 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:37:35.707984 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:37:35.707990 1195787 command_runner.go:130] > # always happen on a node reboot
	I1218 00:37:35.708138 1195787 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:37:35.708160 1195787 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:37:35.708174 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:37:35.708183 1195787 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:37:35.708342 1195787 command_runner.go:130] > # version_file_persist = ""
	I1218 00:37:35.708354 1195787 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:37:35.708363 1195787 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:37:35.708367 1195787 command_runner.go:130] > # internal_wipe = true
	I1218 00:37:35.708381 1195787 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:37:35.708388 1195787 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:37:35.708503 1195787 command_runner.go:130] > # internal_repair = true
	I1218 00:37:35.708512 1195787 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:37:35.708519 1195787 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:37:35.708525 1195787 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:37:35.708671 1195787 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:37:35.708682 1195787 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:37:35.708686 1195787 command_runner.go:130] > [crio.api]
	I1218 00:37:35.708706 1195787 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:37:35.708833 1195787 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:37:35.708843 1195787 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:37:35.708997 1195787 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:37:35.709007 1195787 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:37:35.709013 1195787 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:37:35.709016 1195787 command_runner.go:130] > # stream_port = "0"
	I1218 00:37:35.709022 1195787 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:37:35.709150 1195787 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:37:35.709160 1195787 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:37:35.709282 1195787 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:37:35.709292 1195787 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:37:35.709298 1195787 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709420 1195787 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:37:35.709430 1195787 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:37:35.709436 1195787 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709440 1195787 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:37:35.709447 1195787 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:37:35.709453 1195787 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:37:35.709462 1195787 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:37:35.709593 1195787 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:37:35.709614 1195787 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709735 1195787 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:37:35.709746 1195787 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709864 1195787 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:37:35.709875 1195787 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:37:35.709881 1195787 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:37:35.709885 1195787 command_runner.go:130] > [crio.runtime]
	I1218 00:37:35.709891 1195787 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:37:35.709896 1195787 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:37:35.709907 1195787 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:37:35.709913 1195787 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:37:35.709917 1195787 command_runner.go:130] > # default_ulimits = [
	I1218 00:37:35.710017 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710026 1195787 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:37:35.710154 1195787 command_runner.go:130] > # no_pivot = false
	I1218 00:37:35.710163 1195787 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:37:35.710170 1195787 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:37:35.710300 1195787 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:37:35.710309 1195787 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:37:35.710323 1195787 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:37:35.710336 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710476 1195787 command_runner.go:130] > # conmon = ""
	I1218 00:37:35.710485 1195787 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:37:35.710492 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:37:35.710496 1195787 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:37:35.710508 1195787 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:37:35.710514 1195787 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:37:35.710521 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710524 1195787 command_runner.go:130] > # conmon_env = [
	I1218 00:37:35.710624 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710633 1195787 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:37:35.710639 1195787 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:37:35.710644 1195787 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:37:35.710648 1195787 command_runner.go:130] > # default_env = [
	I1218 00:37:35.710790 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710800 1195787 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:37:35.710816 1195787 command_runner.go:130] > # This option is deprecated, and will be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:37:35.710953 1195787 command_runner.go:130] > # selinux = false
	I1218 00:37:35.710964 1195787 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:37:35.710972 1195787 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:37:35.710977 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.710981 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.710993 1195787 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:37:35.710999 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711131 1195787 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:37:35.711142 1195787 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:37:35.711149 1195787 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:37:35.711162 1195787 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:37:35.711169 1195787 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:37:35.711174 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711345 1195787 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:37:35.711373 1195787 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:37:35.711401 1195787 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:37:35.711419 1195787 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:37:35.711456 1195787 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:37:35.711477 1195787 command_runner.go:130] > # blockio parameters.
	I1218 00:37:35.711667 1195787 command_runner.go:130] > # blockio_reload = false
	I1218 00:37:35.711706 1195787 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:37:35.711725 1195787 command_runner.go:130] > # irqbalance daemon.
	I1218 00:37:35.711743 1195787 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:37:35.711776 1195787 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:37:35.711801 1195787 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:37:35.711821 1195787 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:37:35.711855 1195787 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:37:35.711879 1195787 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:37:35.711898 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.712052 1195787 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:37:35.712092 1195787 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:37:35.712112 1195787 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:37:35.712133 1195787 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:37:35.712151 1195787 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:37:35.712187 1195787 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:37:35.712206 1195787 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:37:35.712253 1195787 command_runner.go:130] > # will be added.
	I1218 00:37:35.712276 1195787 command_runner.go:130] > # default_capabilities = [
	I1218 00:37:35.712420 1195787 command_runner.go:130] > # 	"CHOWN",
	I1218 00:37:35.712461 1195787 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:37:35.712541 1195787 command_runner.go:130] > # 	"FSETID",
	I1218 00:37:35.712631 1195787 command_runner.go:130] > # 	"FOWNER",
	I1218 00:37:35.712660 1195787 command_runner.go:130] > # 	"SETGID",
	I1218 00:37:35.712794 1195787 command_runner.go:130] > # 	"SETUID",
	I1218 00:37:35.712896 1195787 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:37:35.712994 1195787 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:37:35.713065 1195787 command_runner.go:130] > # 	"KILL",
	I1218 00:37:35.713149 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.713172 1195787 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:37:35.713258 1195787 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:37:35.713410 1195787 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:37:35.713489 1195787 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:37:35.713545 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.713716 1195787 command_runner.go:130] > default_sysctls = [
	I1218 00:37:35.713734 1195787 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:37:35.713949 1195787 command_runner.go:130] > ]
	I1218 00:37:35.713959 1195787 command_runner.go:130] > # List of devices on the host that a
	I1218 00:37:35.713966 1195787 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:37:35.713970 1195787 command_runner.go:130] > # allowed_devices = [
	I1218 00:37:35.713995 1195787 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:37:35.714000 1195787 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:37:35.714003 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714008 1195787 command_runner.go:130] > # List of additional devices, specified as
	I1218 00:37:35.714016 1195787 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:37:35.714022 1195787 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:37:35.714028 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.714032 1195787 command_runner.go:130] > # additional_devices = [
	I1218 00:37:35.714035 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714040 1195787 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:37:35.714044 1195787 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:37:35.714048 1195787 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:37:35.714052 1195787 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:37:35.714056 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714062 1195787 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:37:35.714068 1195787 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:37:35.714077 1195787 command_runner.go:130] > # Defaults to false.
	I1218 00:37:35.714083 1195787 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:37:35.714089 1195787 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:37:35.714100 1195787 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:37:35.714537 1195787 command_runner.go:130] > # hooks_dir = [
	I1218 00:37:35.714675 1195787 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:37:35.714791 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.715258 1195787 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:37:35.715414 1195787 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:37:35.715601 1195787 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:37:35.715650 1195787 command_runner.go:130] > #
	I1218 00:37:35.715843 1195787 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:37:35.715943 1195787 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:37:35.716060 1195787 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:37:35.716083 1195787 command_runner.go:130] > #
	I1218 00:37:35.716111 1195787 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:37:35.716131 1195787 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:37:35.716166 1195787 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:37:35.716187 1195787 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:37:35.716204 1195787 command_runner.go:130] > #
	I1218 00:37:35.716248 1195787 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:37:35.716275 1195787 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:37:35.716306 1195787 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:37:35.717368 1195787 command_runner.go:130] > # pids_limit = -1
	I1218 00:37:35.717418 1195787 command_runner.go:130] > # Maximum size allowed for the container log file. Negative numbers indicate
	I1218 00:37:35.717442 1195787 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:37:35.717463 1195787 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:37:35.717499 1195787 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:37:35.717521 1195787 command_runner.go:130] > # log_size_max = -1
	I1218 00:37:35.717693 1195787 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:37:35.717720 1195787 command_runner.go:130] > # log_to_journald = false
	I1218 00:37:35.717752 1195787 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:37:35.717776 1195787 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:37:35.717810 1195787 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:37:35.717835 1195787 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:37:35.717855 1195787 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:37:35.717888 1195787 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:37:35.717911 1195787 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:37:35.717929 1195787 command_runner.go:130] > # read_only = false
	I1218 00:37:35.717949 1195787 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:37:35.717978 1195787 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:37:35.717999 1195787 command_runner.go:130] > # live configuration reload.
	I1218 00:37:35.718017 1195787 command_runner.go:130] > # log_level = "info"
	I1218 00:37:35.718039 1195787 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:37:35.718073 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.718091 1195787 command_runner.go:130] > # log_filter = ""
	I1218 00:37:35.718112 1195787 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718144 1195787 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:37:35.718167 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718189 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718221 1195787 command_runner.go:130] > # uid_mappings = ""
	I1218 00:37:35.718243 1195787 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718262 1195787 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:37:35.718280 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718311 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718343 1195787 command_runner.go:130] > # gid_mappings = ""
	I1218 00:37:35.718363 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:37:35.718395 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718420 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718442 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718481 1195787 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:37:35.718507 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:37:35.718529 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718561 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718589 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718607 1195787 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:37:35.718641 1195787 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:37:35.718665 1195787 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:37:35.718685 1195787 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:37:35.718717 1195787 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:37:35.718741 1195787 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:37:35.718762 1195787 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:37:35.718793 1195787 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:37:35.718814 1195787 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:37:35.718831 1195787 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:37:35.718851 1195787 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:37:35.718882 1195787 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:37:35.718907 1195787 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:37:35.718931 1195787 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:37:35.718965 1195787 command_runner.go:130] > # shared_cpuset determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:37:35.718989 1195787 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:37:35.719009 1195787 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:37:35.719039 1195787 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:37:35.719315 1195787 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:37:35.719348 1195787 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:37:35.719365 1195787 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:37:35.719396 1195787 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:37:35.719423 1195787 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:37:35.719450 1195787 command_runner.go:130] > # pinns_path = ""
	I1218 00:37:35.719484 1195787 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:37:35.719510 1195787 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:37:35.719528 1195787 command_runner.go:130] > # enable_criu_support = true
	I1218 00:37:35.719563 1195787 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:37:35.719586 1195787 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:37:35.719602 1195787 command_runner.go:130] > # enable_pod_events = false
	I1218 00:37:35.719622 1195787 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:37:35.719651 1195787 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:37:35.719672 1195787 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:37:35.719690 1195787 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:37:35.719711 1195787 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:37:35.719747 1195787 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:37:35.719770 1195787 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:37:35.719795 1195787 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:37:35.719826 1195787 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:37:35.719849 1195787 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:37:35.719865 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.719885 1195787 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:37:35.719916 1195787 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:37:35.719938 1195787 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:37:35.719957 1195787 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:37:35.719973 1195787 command_runner.go:130] > #
	I1218 00:37:35.720002 1195787 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:37:35.720024 1195787 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:37:35.720041 1195787 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:37:35.720059 1195787 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:37:35.720096 1195787 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:37:35.720121 1195787 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:37:35.720139 1195787 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:37:35.720170 1195787 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:37:35.720190 1195787 command_runner.go:130] > # monitor_env = []
	I1218 00:37:35.720207 1195787 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:37:35.720256 1195787 command_runner.go:130] > # allowed_annotations = []
	I1218 00:37:35.720274 1195787 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:37:35.720279 1195787 command_runner.go:130] > # no_sync_log = false
	I1218 00:37:35.720284 1195787 command_runner.go:130] > # default_annotations = {}
	I1218 00:37:35.720288 1195787 command_runner.go:130] > # stream_websockets = false
	I1218 00:37:35.720292 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.720348 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.720360 1195787 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:37:35.720367 1195787 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:37:35.720386 1195787 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:37:35.720399 1195787 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:37:35.720403 1195787 command_runner.go:130] > #   in $PATH.
	I1218 00:37:35.720418 1195787 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:37:35.720433 1195787 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:37:35.720439 1195787 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:37:35.720444 1195787 command_runner.go:130] > #   state.
	I1218 00:37:35.720451 1195787 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:37:35.720460 1195787 command_runner.go:130] > #   file. This can only be used with when using the VM runtime_type.
	I1218 00:37:35.720466 1195787 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:37:35.720473 1195787 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:37:35.720480 1195787 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:37:35.720496 1195787 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:37:35.720506 1195787 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:37:35.720513 1195787 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:37:35.720531 1195787 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:37:35.720543 1195787 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:37:35.720550 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:37:35.720566 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:37:35.720576 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:37:35.720582 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:37:35.720590 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:37:35.720628 1195787 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:37:35.720649 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:37:35.720665 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for the container init process.
	I1218 00:37:35.720671 1195787 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:37:35.720679 1195787 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:37:35.720689 1195787 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:37:35.720706 1195787 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:37:35.720719 1195787 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:37:35.720733 1195787 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:37:35.720746 1195787 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:37:35.720754 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:37:35.720760 1195787 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:37:35.720764 1195787 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:37:35.720772 1195787 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:37:35.720777 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:37:35.720783 1195787 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:37:35.720795 1195787 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:37:35.720813 1195787 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:37:35.720825 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:37:35.720833 1195787 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:37:35.720849 1195787 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:37:35.720857 1195787 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:37:35.720865 1195787 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:37:35.720875 1195787 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:37:35.720882 1195787 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:37:35.720888 1195787 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:37:35.720897 1195787 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:37:35.720905 1195787 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:37:35.720932 1195787 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:37:35.720946 1195787 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:37:35.720960 1195787 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:37:35.720968 1195787 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:37:35.720975 1195787 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:37:35.720983 1195787 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:37:35.720995 1195787 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:37:35.721013 1195787 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:37:35.721023 1195787 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:37:35.721031 1195787 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:37:35.721033 1195787 command_runner.go:130] > #
	I1218 00:37:35.721038 1195787 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:37:35.721043 1195787 command_runner.go:130] > #
	I1218 00:37:35.721049 1195787 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:37:35.721058 1195787 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:37:35.721061 1195787 command_runner.go:130] > #
	I1218 00:37:35.721072 1195787 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:37:35.721082 1195787 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:37:35.721085 1195787 command_runner.go:130] > #
	I1218 00:37:35.721091 1195787 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:37:35.721097 1195787 command_runner.go:130] > # feature.
	I1218 00:37:35.721100 1195787 command_runner.go:130] > #
	I1218 00:37:35.721106 1195787 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:37:35.721112 1195787 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:37:35.721119 1195787 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:37:35.721125 1195787 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:37:35.721131 1195787 command_runner.go:130] > # seconds if "io.kubernetes.cri-o.seccompNotifierAction" is set to "stop".
	I1218 00:37:35.721141 1195787 command_runner.go:130] > #
	I1218 00:37:35.721147 1195787 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:37:35.721153 1195787 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:37:35.721158 1195787 command_runner.go:130] > #
	I1218 00:37:35.721164 1195787 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:37:35.721170 1195787 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:37:35.721177 1195787 command_runner.go:130] > #
	I1218 00:37:35.721183 1195787 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:37:35.721188 1195787 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:37:35.721192 1195787 command_runner.go:130] > # limitation.
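The notifier setup described above can be sketched as a minimal runtime-handler entry (the handler name and path mirror the runc table below; treating this as an illustration, not the config actually in use here):

```toml
# Hypothetical runtime handler that opts into the seccomp notifier.
# Requires at least runc 1.1.0 or crun 0.19, per the comments above.
[crio.runtime.runtimes.runc]
runtime_path = "/usr/libexec/crio/runc"
allowed_annotations = [
	"io.kubernetes.cri-o.seccompNotifierAction",
]
```

A pod sandbox annotated with `io.kubernetes.cri-o.seccompNotifierAction=stop` would then be terminated roughly 5 seconds after a blocked syscall is observed; its `restartPolicy` should be "Never" so the kubelet does not immediately restart the container.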
	I1218 00:37:35.721196 1195787 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:37:35.721200 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:37:35.721204 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721215 1195787 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:37:35.721228 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721232 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721236 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721241 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721248 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721251 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721255 1195787 command_runner.go:130] > allowed_annotations = [
	I1218 00:37:35.721261 1195787 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:37:35.721265 1195787 command_runner.go:130] > ]
	I1218 00:37:35.721270 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721274 1195787 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:37:35.721279 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:37:35.721282 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721289 1195787 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:37:35.721293 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721307 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721312 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721316 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721320 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721325 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721331 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721339 1195787 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:37:35.721347 1195787 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:37:35.721353 1195787 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:37:35.721361 1195787 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and a set of resources it supports mutating.
	I1218 00:37:35.721384 1195787 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:37:35.721399 1195787 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores; this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:37:35.721406 1195787 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:37:35.721417 1195787 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:37:35.721427 1195787 command_runner.go:130] > # For a container to opt into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:37:35.721438 1195787 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:37:35.721444 1195787 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:37:35.721457 1195787 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:37:35.721461 1195787 command_runner.go:130] > # Example:
	I1218 00:37:35.721466 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:37:35.721472 1195787 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:37:35.721477 1195787 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:37:35.721487 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:37:35.721490 1195787 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:37:35.721494 1195787 command_runner.go:130] > # cpushares = "5"
	I1218 00:37:35.721498 1195787 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:37:35.721502 1195787 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:37:35.721507 1195787 command_runner.go:130] > # cpulimit = "35"
	I1218 00:37:35.721510 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.721516 1195787 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:37:35.721524 1195787 command_runner.go:130] > # To opt in, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:37:35.721529 1195787 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:37:35.721535 1195787 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:37:35.721544 1195787 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:37:35.721552 1195787 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
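As a sketch, a pod opting into the "workload-type" example above might carry annotations like these (the container name is hypothetical, and the per-container form follows the example annotation shown above):

```yaml
# Hypothetical pod metadata for the "workload-type" workload.
metadata:
  annotations:
    # Activation annotation: key only, the value is ignored.
    io.crio/workload: ""
    # Per-container override of the default cpushares value.
    io.crio.workload-type/my-container: '{"cpushares": "512"}'
```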
	I1218 00:37:35.721556 1195787 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:37:35.721563 1195787 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:37:35.721568 1195787 command_runner.go:130] > # Default value is set to true
	I1218 00:37:35.721574 1195787 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:37:35.721580 1195787 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:37:35.721588 1195787 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:37:35.721592 1195787 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:37:35.721621 1195787 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:37:35.721627 1195787 command_runner.go:130] > # timezone sets the timezone for a container in CRI-O.
	I1218 00:37:35.721635 1195787 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:37:35.721640 1195787 command_runner.go:130] > # timezone = ""
	I1218 00:37:35.721647 1195787 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:37:35.721650 1195787 command_runner.go:130] > #
	I1218 00:37:35.721656 1195787 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:37:35.721665 1195787 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:37:35.721672 1195787 command_runner.go:130] > [crio.image]
	I1218 00:37:35.721679 1195787 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:37:35.721683 1195787 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:37:35.721689 1195787 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:37:35.721701 1195787 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721706 1195787 command_runner.go:130] > # global_auth_file = ""
	I1218 00:37:35.721711 1195787 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:37:35.721723 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721728 1195787 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.721738 1195787 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:37:35.721745 1195787 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721754 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721758 1195787 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:37:35.721764 1195787 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:37:35.721769 1195787 command_runner.go:130] > # When explicitly set to "", it will fall back to the entrypoint and command
	I1218 00:37:35.721776 1195787 command_runner.go:130] > # specified in the pause image. When commented out, it will fall back to the
	I1218 00:37:35.721781 1195787 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:37:35.721787 1195787 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:37:35.721793 1195787 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:37:35.721799 1195787 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:37:35.721805 1195787 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:37:35.721813 1195787 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:37:35.721819 1195787 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:37:35.721825 1195787 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:37:35.721831 1195787 command_runner.go:130] > # pinned_images = [
	I1218 00:37:35.721834 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721840 1195787 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:37:35.721846 1195787 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:37:35.721853 1195787 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:37:35.721859 1195787 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:37:35.721866 1195787 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:37:35.721871 1195787 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:37:35.721879 1195787 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:37:35.721892 1195787 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:37:35.721901 1195787 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:37:35.721912 1195787 command_runner.go:130] > # or the concatenated path is nonexistent, then the signature_policy or system
	I1218 00:37:35.721918 1195787 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:37:35.721923 1195787 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:37:35.721928 1195787 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:37:35.721935 1195787 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:37:35.721938 1195787 command_runner.go:130] > # changing them here.
	I1218 00:37:35.721944 1195787 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:37:35.721955 1195787 command_runner.go:130] > # insecure_registries = [
	I1218 00:37:35.721957 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721964 1195787 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:37:35.721969 1195787 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:37:35.721977 1195787 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:37:35.721983 1195787 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:37:35.721987 1195787 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:37:35.721998 1195787 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:37:35.722005 1195787 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:37:35.722009 1195787 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:37:35.722015 1195787 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:37:35.722024 1195787 command_runner.go:130] > # gets canceled. This value will also be used for calculating the pull progress interval as pull_progress_timeout / 10.
	I1218 00:37:35.722031 1195787 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:37:35.722036 1195787 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:37:35.722048 1195787 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:37:35.722054 1195787 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:37:35.722062 1195787 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:37:35.722070 1195787 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:37:35.722074 1195787 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:37:35.722081 1195787 command_runner.go:130] > # oci_artifact_mount_support controls whether CRI-O should support OCI artifacts.
	I1218 00:37:35.722087 1195787 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:37:35.722091 1195787 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:37:35.722097 1195787 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:37:35.722108 1195787 command_runner.go:130] > # CNI plugins.
	I1218 00:37:35.722117 1195787 command_runner.go:130] > [crio.network]
	I1218 00:37:35.722131 1195787 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:37:35.722136 1195787 command_runner.go:130] > # CRI-O will pick-up the first one found in network_dir.
	I1218 00:37:35.722142 1195787 command_runner.go:130] > # cni_default_network = ""
	I1218 00:37:35.722148 1195787 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:37:35.722156 1195787 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:37:35.722162 1195787 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:37:35.722165 1195787 command_runner.go:130] > # plugin_dirs = [
	I1218 00:37:35.722169 1195787 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:37:35.722172 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722176 1195787 command_runner.go:130] > # List of included pod metrics.
	I1218 00:37:35.722180 1195787 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:37:35.722182 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722190 1195787 command_runner.go:130] > # A necessary configuration for Prometheus-based metrics retrieval
	I1218 00:37:35.722196 1195787 command_runner.go:130] > [crio.metrics]
	I1218 00:37:35.722201 1195787 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:37:35.722205 1195787 command_runner.go:130] > # enable_metrics = false
	I1218 00:37:35.722209 1195787 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:37:35.722215 1195787 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:37:35.722222 1195787 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:37:35.722233 1195787 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:37:35.722239 1195787 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:37:35.722243 1195787 command_runner.go:130] > # metrics_collectors = [
	I1218 00:37:35.722247 1195787 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:37:35.722252 1195787 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:37:35.722256 1195787 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:37:35.722260 1195787 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:37:35.722266 1195787 command_runner.go:130] > # 	"operations_total",
	I1218 00:37:35.722270 1195787 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:37:35.722275 1195787 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:37:35.722279 1195787 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:37:35.722283 1195787 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:37:35.722287 1195787 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:37:35.722295 1195787 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:37:35.722299 1195787 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:37:35.722312 1195787 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:37:35.722316 1195787 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:37:35.722321 1195787 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:37:35.722325 1195787 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:37:35.722329 1195787 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:37:35.722332 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722338 1195787 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:37:35.722342 1195787 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:37:35.722347 1195787 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:37:35.722351 1195787 command_runner.go:130] > # metrics_port = 9090
	I1218 00:37:35.722358 1195787 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:37:35.722362 1195787 command_runner.go:130] > # metrics_socket = ""
	I1218 00:37:35.722377 1195787 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:37:35.722386 1195787 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:37:35.722398 1195787 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:37:35.722403 1195787 command_runner.go:130] > # certificate on any modification event.
	I1218 00:37:35.722406 1195787 command_runner.go:130] > # metrics_cert = ""
	I1218 00:37:35.722411 1195787 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:37:35.722421 1195787 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:37:35.722424 1195787 command_runner.go:130] > # metrics_key = ""
	I1218 00:37:35.722433 1195787 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:37:35.722437 1195787 command_runner.go:130] > [crio.tracing]
	I1218 00:37:35.722445 1195787 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:37:35.722451 1195787 command_runner.go:130] > # enable_tracing = false
	I1218 00:37:35.722464 1195787 command_runner.go:130] > # Address on which the gRPC trace collector listens.
	I1218 00:37:35.722472 1195787 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:37:35.722479 1195787 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:37:35.722485 1195787 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:37:35.722490 1195787 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:37:35.722493 1195787 command_runner.go:130] > [crio.nri]
	I1218 00:37:35.722498 1195787 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:37:35.722507 1195787 command_runner.go:130] > # enable_nri = true
	I1218 00:37:35.722519 1195787 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:37:35.722524 1195787 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:37:35.722528 1195787 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:37:35.722539 1195787 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:37:35.722544 1195787 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:37:35.722549 1195787 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:37:35.722557 1195787 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:37:35.722613 1195787 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:37:35.722623 1195787 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:37:35.722628 1195787 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:37:35.722634 1195787 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:37:35.722640 1195787 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:37:35.722645 1195787 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:37:35.722651 1195787 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:37:35.722658 1195787 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:37:35.722663 1195787 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:37:35.722666 1195787 command_runner.go:130] > # - OCI hook injection
	I1218 00:37:35.722671 1195787 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:37:35.722677 1195787 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:37:35.722683 1195787 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:37:35.722689 1195787 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:37:35.722696 1195787 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:37:35.722702 1195787 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:37:35.722709 1195787 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:37:35.722712 1195787 command_runner.go:130] > #
	I1218 00:37:35.722717 1195787 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:37:35.722724 1195787 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:37:35.722729 1195787 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:37:35.722734 1195787 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:37:35.722739 1195787 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:37:35.722744 1195787 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:37:35.722749 1195787 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:37:35.722759 1195787 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:37:35.722765 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722771 1195787 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:37:35.722777 1195787 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:37:35.722788 1195787 command_runner.go:130] > [crio.stats]
	I1218 00:37:35.722797 1195787 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:37:35.722805 1195787 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:37:35.722809 1195787 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:37:35.722814 1195787 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:37:35.722821 1195787 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:37:35.722825 1195787 command_runner.go:130] > # collection_period = 0
	I1218 00:37:35.722870 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686277403Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:37:35.722885 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686455769Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:37:35.722906 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686635242Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:37:35.722915 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686725939Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:37:35.722930 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686860827Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.722940 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.687143526Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:37:35.722954 1195787 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:37:35.723070 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:35.723084 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:35.723105 1195787 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:37:35.723135 1195787 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath
:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:37:35.723264 1195787 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:37:35.723342 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:37:35.730799 1195787 command_runner.go:130] > kubeadm
	I1218 00:37:35.730815 1195787 command_runner.go:130] > kubectl
	I1218 00:37:35.730820 1195787 command_runner.go:130] > kubelet
	I1218 00:37:35.730852 1195787 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:37:35.730903 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:37:35.737892 1195787 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:37:35.749699 1195787 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:37:35.761635 1195787 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:37:35.773650 1195787 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:37:35.777155 1195787 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:37:35.777265 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.913809 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:36.641224 1195787 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:37:36.641246 1195787 certs.go:195] generating shared ca certs ...
	I1218 00:37:36.641263 1195787 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:36.641410 1195787 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:37:36.641464 1195787 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:37:36.641475 1195787 certs.go:257] generating profile certs ...
	I1218 00:37:36.641577 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:37:36.641667 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:37:36.641711 1195787 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:37:36.641724 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:37:36.641737 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:37:36.641753 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:37:36.641763 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:37:36.641780 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:37:36.641792 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:37:36.641807 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:37:36.641818 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:37:36.641873 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:37:36.641907 1195787 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:37:36.641920 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:37:36.641952 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:37:36.641982 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:37:36.642014 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:37:36.642068 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:36.642106 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:37:36.642122 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.642133 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.642704 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:37:36.662928 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:37:36.685489 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:37:36.708038 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:37:36.726679 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:37:36.744109 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:37:36.760724 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:37:36.777802 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:37:36.794736 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:37:36.811089 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:37:36.827838 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:37:36.844718 1195787 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:37:36.856626 1195787 ssh_runner.go:195] Run: openssl version
	I1218 00:37:36.862122 1195787 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:37:36.862595 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.869813 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:37:36.876968 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880287 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880319 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880364 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.920445 1195787 command_runner.go:130] > b5213941
	I1218 00:37:36.920887 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:37:36.928015 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.934857 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:37:36.941992 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945456 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945522 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945583 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.985712 1195787 command_runner.go:130] > 51391683
	I1218 00:37:36.986191 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:37:36.993294 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.001803 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:37:37.011590 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.016819 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017267 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017348 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.061113 1195787 command_runner.go:130] > 3ec20f2e
	I1218 00:37:37.061606 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
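	The three hash-and-symlink rounds above follow the standard OpenSSL trust-store convention: a CA directory is indexed by subject-name hash, so `/etc/ssl/certs/<hash>.0` must point at the PEM for verification to find it. A minimal sketch of one round, using a throwaway self-signed certificate in place of the minikube CAs (the `demoCA` name and temp directory are illustrative):

```shell
#!/bin/sh
# Sketch of one CA-install round as logged above: compute the OpenSSL
# subject-name hash of the PEM, then symlink <hash>.0 -> PEM.
set -eu
workdir=$(mktemp -d)

# Throwaway self-signed cert standing in for minikubeCA.pem (illustrative only).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$workdir/ca.key" -out "$workdir/demoCA.pem" -days 1 2>/dev/null

hash=$(openssl x509 -hash -noout -in "$workdir/demoCA.pem")  # 8 hex chars, e.g. b5213941
ln -fs "$workdir/demoCA.pem" "$workdir/$hash.0"

# This symlink is what the `sudo test -L /etc/ssl/certs/<hash>.0` step checks.
test -L "$workdir/$hash.0" && echo "linked as $hash.0"
```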
	I1218 00:37:37.068668 1195787 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072025 1195787 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072050 1195787 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:37:37.072057 1195787 command_runner.go:130] > Device: 259,1	Inode: 1326178     Links: 1
	I1218 00:37:37.072063 1195787 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:37.072070 1195787 command_runner.go:130] > Access: 2025-12-18 00:33:28.828061434 +0000
	I1218 00:37:37.072075 1195787 command_runner.go:130] > Modify: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072080 1195787 command_runner.go:130] > Change: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072086 1195787 command_runner.go:130] >  Birth: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072155 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:37:37.111978 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.112489 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:37:37.152999 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.153074 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:37:37.194884 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.195292 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:37:37.235218 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.235658 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:37:37.275710 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.276177 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:37:37.316082 1195787 command_runner.go:130] > Certificate will not expire
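	The `-checkend 86400` flag asks whether the certificate will still be valid one day from now; exit status 0 produces the "Certificate will not expire" lines above. A standalone sketch against a freshly minted two-day certificate (file names are illustrative):

```shell
#!/bin/sh
# `openssl x509 -checkend N` exits 0 (printing "Certificate will not expire")
# when the cert is still valid N seconds from now; minikube runs this once
# per control-plane certificate.
set -eu
workdir=$(mktemp -d)

openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$workdir/demo.key" -out "$workdir/demo.crt" -days 2 2>/dev/null

openssl x509 -noout -in "$workdir/demo.crt" -checkend 86400

# A window longer than the cert's remaining lifetime fails the check (exit 1):
if ! openssl x509 -noout -in "$workdir/demo.crt" -checkend 259200; then
  echo "would expire within 3 days"
fi
```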
	I1218 00:37:37.316486 1195787 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwa
rePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:37.316593 1195787 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:37:37.316685 1195787 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:37:37.341722 1195787 cri.go:89] found id: ""
	I1218 00:37:37.341828 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:37:37.348335 1195787 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:37:37.348357 1195787 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:37:37.348372 1195787 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:37:37.349183 1195787 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:37:37.349197 1195787 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:37:37.349253 1195787 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:37:37.356307 1195787 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:37:37.356734 1195787 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.356836 1195787 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "functional-288604" cluster setting kubeconfig missing "functional-288604" context setting]
	I1218 00:37:37.357097 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.357514 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.357675 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.358178 1195787 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:37:37.358185 1195787 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:37:37.358343 1195787 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:37:37.358365 1195787 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:37:37.358389 1195787 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:37:37.358400 1195787 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:37:37.358747 1195787 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:37:37.366250 1195787 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:37:37.366287 1195787 kubeadm.go:602] duration metric: took 17.084351ms to restartPrimaryControlPlane
	I1218 00:37:37.366297 1195787 kubeadm.go:403] duration metric: took 49.819997ms to StartCluster
	I1218 00:37:37.366310 1195787 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.366369 1195787 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.366947 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.367145 1195787 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:37:37.367532 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:37.367580 1195787 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:37:37.367705 1195787 addons.go:70] Setting storage-provisioner=true in profile "functional-288604"
	I1218 00:37:37.367724 1195787 addons.go:239] Setting addon storage-provisioner=true in "functional-288604"
	I1218 00:37:37.367744 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.368436 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.368583 1195787 addons.go:70] Setting default-storageclass=true in profile "functional-288604"
	I1218 00:37:37.368601 1195787 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-288604"
	I1218 00:37:37.368944 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.373199 1195787 out.go:179] * Verifying Kubernetes components...
	I1218 00:37:37.376080 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:37.397822 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.397983 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.398246 1195787 addons.go:239] Setting addon default-storageclass=true in "functional-288604"
	I1218 00:37:37.398278 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.398894 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.407451 1195787 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:37:37.410300 1195787 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.410322 1195787 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:37:37.410384 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.434096 1195787 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:37.434117 1195787 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:37:37.434174 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.457842 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.477819 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.583963 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:37.618382 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.637024 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.392142 1195787 node_ready.go:35] waiting up to 6m0s for node "functional-288604" to be "Ready" ...
	I1218 00:37:38.392289 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.392356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.392602 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392638 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392662 1195787 retry.go:31] will retry after 293.380468ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392710 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392727 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392733 1195787 retry.go:31] will retry after 283.333163ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:38.676355 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.686660 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:38.750557 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753745 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753775 1195787 retry.go:31] will retry after 508.906429ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753840 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753899 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753916 1195787 retry.go:31] will retry after 283.918132ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.893115 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.893199 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.893535 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.038817 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.092066 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.095485 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.095518 1195787 retry.go:31] will retry after 317.14343ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.262906 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.318327 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.322166 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.322196 1195787 retry.go:31] will retry after 611.398612ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.392378 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.413200 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.474250 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.474288 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.474326 1195787 retry.go:31] will retry after 551.991324ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.892368 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.892757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.933930 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.991113 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.991153 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.991172 1195787 retry.go:31] will retry after 590.272449ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.027415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:40.085906 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.089482 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.089515 1195787 retry.go:31] will retry after 1.798316027s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.392931 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.393007 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.393310 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:40.393376 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:40.582668 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:40.643859 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.643900 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.643941 1195787 retry.go:31] will retry after 1.196819353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.892768 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.392495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.841577 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:41.888099 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:41.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.901267 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.901306 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.901323 1195787 retry.go:31] will retry after 1.106575841s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948402 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.948447 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948500 1195787 retry.go:31] will retry after 1.314106681s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:42.393054 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.393195 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.393477 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:42.393524 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:42.893249 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.893594 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.008894 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:43.066157 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.066194 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.066212 1195787 retry.go:31] will retry after 2.952953914s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.263490 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:43.325047 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.325147 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.325201 1195787 retry.go:31] will retry after 2.165088511s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.892529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.892853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.392698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.892859 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:44.892927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:45.392615 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.393055 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:45.491313 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:45.548834 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:45.552259 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.552290 1195787 retry.go:31] will retry after 4.009218302s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.020180 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:46.081331 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:46.081373 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.081392 1195787 retry.go:31] will retry after 2.724964309s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.392810 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.392886 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.393216 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.893121 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.893451 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:46.893527 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:47.392312 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.392379 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.392690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:47.892435 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.892508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.806570 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:48.859925 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:48.863450 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.863523 1195787 retry.go:31] will retry after 5.125713123s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.892640 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.892710 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.892972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:49.392419 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.392858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:49.392930 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:49.562244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:49.616549 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:49.619912 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.619976 1195787 retry.go:31] will retry after 7.525324152s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.893380 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.893476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.893792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.392521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.892413 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.392501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:51.892896 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:52.392394 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:52.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.892886 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.892492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.990244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:54.052144 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:54.052189 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.052212 1195787 retry.go:31] will retry after 10.028215297s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:54.392879 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:54.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.892892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.892535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:56.892936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:57.146448 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:57.223873 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:57.223911 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.223929 1195787 retry.go:31] will retry after 7.68443688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.392441 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.392757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:57.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.892896 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.892496 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.892902 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:58.892976 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:59.392373 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.392729 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:59.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.392605 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.392823 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.393374 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.893180 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.893259 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.893590 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:00.893634 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:01.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.392775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:01.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.892560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.392629 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.393091 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.893045 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.893120 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.893431 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:03.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.393360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.393682 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:03.393752 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:03.892452 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.892588 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.081415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:04.149098 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.149154 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.149173 1195787 retry.go:31] will retry after 12.181474759s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.392826 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.892706 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.908952 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:04.983582 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.983679 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.983707 1195787 retry.go:31] will retry after 20.674508131s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:05.393152 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.393222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.393548 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:05.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:05.892840 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:06.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:06.892466 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.892581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.392796 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.392870 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.393185 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.893008 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.893099 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.893411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:07.893460 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:08.393200 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.393269 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.393580 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:08.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.392350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:10.392470 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.392542 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:10.392885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:10.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.392567 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.392748 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.392724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.892526 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.892600 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.892927 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:12.892994 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:13.392672 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.393083 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:13.892350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.892423 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:15.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.392797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:15.392868 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:15.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.331590 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:16.385966 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:16.389831 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.389871 1195787 retry.go:31] will retry after 10.81475415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.393176 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.393493 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.893314 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.893409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.893794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:17.392528 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.392670 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:17.393070 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:17.892892 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.892977 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.893319 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.393167 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.393296 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:19.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.392531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.393011 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:19.393093 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:19.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.892493 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.392457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.392540 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.892404 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.892505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.892840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.392336 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.392752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:21.892899 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:22.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.392824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:22.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.892650 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.892812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:24.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.392459 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.392739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:24.392822 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:24.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.392505 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.392578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.392919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.658449 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:25.718689 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:25.718787 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.718811 1195787 retry.go:31] will retry after 20.411460434s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.893032 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.893152 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.893496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:26.393268 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.393345 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.393658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:26.393735 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:26.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.205308 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:27.264744 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:27.264795 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.264821 1195787 retry.go:31] will retry after 26.872581906s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.393247 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.393343 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.393691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.392400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.392499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.892880 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:28.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:29.392435 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.392530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:29.892518 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.892615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.892959 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.392483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.892684 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:30.893088 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:31.392443 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.392538 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:31.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.392398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.892495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:33.392363 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.392786 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:33.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:33.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.892942 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:35.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:35.392919 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:35.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.892552 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.892909 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:37.392676 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.392747 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.393087 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:37.393163 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:37.892815 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.892918 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.893191 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.392972 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.393069 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.393395 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.893206 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.893283 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.893614 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.892914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:39.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:40.392478 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:40.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.392429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.892557 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.892907 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:42.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.392468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:42.392895 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.892798 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.893169 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.392965 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.393041 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.393425 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.893065 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.893192 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.893542 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:44.393319 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.393457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.393797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:44.393864 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:44.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.892887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.392407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.392691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.892831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.131350 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:46.207148 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:46.207192 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.207211 1195787 retry.go:31] will retry after 46.493082425s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.392632 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.393042 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.892356 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.892439 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:46.892784 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:47.392561 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.392655 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.393022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:47.892957 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.893052 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.393028 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.393151 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.393502 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.893231 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.893339 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:48.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:49.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.892410 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.892719 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.392567 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.392922 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.892639 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.892718 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.893017 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:51.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:51.392927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:51.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.392587 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.392689 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.393078 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.892911 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.893278 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:53.393104 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.393175 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.393506 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:53.393578 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:53.893174 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.893253 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.893635 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.138097 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:54.199604 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:54.199639 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.199657 1195787 retry.go:31] will retry after 32.999586692s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.392915 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.392997 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.393320 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.893151 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.893222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.893558 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:55.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.393372 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.393696 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:55.393771 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:55.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.392482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.392536 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:57.892898 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:58.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.392510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.392842 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.892427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.892690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.392771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:00.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.392852 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:00.392906 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:00.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.392554 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.392642 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.392932 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:02.392460 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:02.392937 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:02.893082 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.893161 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.893517 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.393213 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.393335 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.393710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.893298 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.893393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.893695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.392345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.392774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.892322 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.892394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:04.892799 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:05.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:05.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.892687 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.893007 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.392320 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.392389 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.392734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:06.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:07.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.392661 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.393015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:07.892841 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.892966 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.893308 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.393076 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.393143 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.393465 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.893245 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.893642 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:08.893706 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:09.392340 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:09.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.392603 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.393041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.892396 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.892674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:11.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.392788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:11.392854 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:11.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.892864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.392694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:13.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.392558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:13.392904 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:13.892358 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.392462 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.892493 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.892568 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.892916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.392728 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:15.892902 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:16.392592 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.393051 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:16.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.892766 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.392504 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.392612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.392914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.892926 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.893001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:17.893380 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:18.393186 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.393287 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.393589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:18.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.892436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.892724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.892501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:20.392577 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.392677 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:20.393034 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:20.892700 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.892780 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.893096 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.392884 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.392972 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.393246 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.893036 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.893115 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.893439 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:22.393105 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.393180 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.393531 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:22.393602 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:22.893283 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.893357 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.893654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.392360 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.392759 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.892410 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.892763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.392324 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.392671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.892367 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:24.892851 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:25.392405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.392832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:25.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.892598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:26.893059 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:27.199423 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:39:27.258871 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262674 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262814 1195787 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:27.393144 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.393338 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.393739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:27.892445 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.892515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.392686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.393001 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.892388 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:29.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:29.392759 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:29.892449 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.892575 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.892898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.892522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:31.392593 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.392698 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.393057 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:31.393175 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.892746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.393076 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.700811 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:39:32.759422 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759519 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759610 1195787 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:32.762569 1195787 out.go:179] * Enabled addons: 
	I1218 00:39:32.766452 1195787 addons.go:530] duration metric: took 1m55.398865574s for enable addons: enabled=[]
	I1218 00:39:32.892720 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.892834 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.893134 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:33.392885 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.392951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.393266 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:33.393360 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:33.893120 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.893193 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.893559 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.393379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.393480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.393839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.892868 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.392614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.892537 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:35.892942 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:36.392383 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.392802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:36.892719 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.892802 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.893187 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.393130 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.393210 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.393536 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.892374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:38.392658 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.392729 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:38.393158 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:38.892929 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.893353 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.393129 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.393197 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.393452 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.893242 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.893317 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.893673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.392517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.892369 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.892525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:40.892938 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:41.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.392798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:41.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.892875 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.392446 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.892629 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.892701 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.893027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:42.893091 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:43.392773 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.392853 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.393188 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:43.892986 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.893071 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.893334 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.393187 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.393481 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.893282 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.893350 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.893683 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:44.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:45.392327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.392700 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:45.892412 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.392830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.892694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:47.392541 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:47.392943 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:47.892913 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.892990 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.392935 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.393001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.393289 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.893086 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.893157 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.893470 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:49.393330 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.393420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.393740 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:49.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:49.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.892391 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.892665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.392473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.892447 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.392762 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.892406 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:51.892823 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:52.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:52.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.392422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.392736 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.892489 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.892612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:53.893004 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:54.392376 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:54.892366 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.892778 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:56.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:56.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:56.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.392473 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.892694 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.892772 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.892739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:58.892789 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:59.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.392532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:59.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.892633 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.392899 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.892419 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:00.892885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:01.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:01.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.892996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.392436 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.392515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.892688 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.892766 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:02.893136 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:03.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:03.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.392791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.892671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:05.392381 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:05.392855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:05.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:07.392557 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.392630 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.392948 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:07.393044 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:07.892947 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.893292 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.393116 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.393189 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.393503 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.893257 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.893330 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.893657 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:09.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.393365 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.393623 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:09.393667 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:09.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.892429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.392530 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.892422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.892749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.892544 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.892620 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:11.893010 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:12.392339 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.392715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:12.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.892814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:14.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.392813 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:14.392867 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:14.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.392448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.892562 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.892645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.892993 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:16.392714 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.392789 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.393108 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:16.393181 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:16.892981 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.893057 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.893318 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.393280 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.393362 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.393715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.392754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.892443 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.892891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:18.892951 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:19.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.392715 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.393048 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:19.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.392464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.392845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:21.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:21.392721 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:21.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.392527 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:23.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:23.392882 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:23.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	... (the GET poll of https://192.168.49.2:8441/api/v1/nodes/functional-288604 repeats every ~500 ms with the same empty response, and the node_ready.go:55 "connection refused" warning recurs roughly every 2 s, from 00:40:24 through 00:40:58; identical log cycles elided) ...
	I1218 00:40:58.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.392417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.392375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.392732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:00.892926 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:01.392600 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.392680 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.393027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:01.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.392459 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.392534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.392891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.892379 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:03.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.392673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:03.392717 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:03.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.892520 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.892803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:05.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:05.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:05.892501 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.892578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.892871 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.392458 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.392846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.892564 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.892651 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:07.392849 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.392927 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.393249 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:07.393300 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:07.893061 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.893141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.893407 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.393571 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.893213 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.893285 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.893586 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:09.393338 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.393409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.393737 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:09.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:09.892310 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.892384 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.892460 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.892531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.392821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.892452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:11.892845 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:12.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.392447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:12.892659 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.893041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.392855 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.892343 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:14.392354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.392431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:14.392815 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:14.892477 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.892893 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:16.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.392508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:16.392886 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:16.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.392663 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.393122 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.893034 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.893112 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.893434 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:18.393225 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.393298 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.393566 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:18.393619 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:18.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.892776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.392463 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.392545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:20.892875 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:21.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.392697 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:21.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.392426 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.892691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:23.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:23.392897 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:23.892549 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.892621 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.892930 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.392371 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:25.392487 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.392571 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.392908 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:25.392969 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:25.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.892701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.392465 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.392539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.892616 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.892694 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.892997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:27.893042 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:28.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:28.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.392438 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.892380 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:30.392322 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:30.392775 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:30.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.392484 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.392563 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.392894 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:32.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:32.392859 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:32.892645 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.892720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.893273 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.393053 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.393128 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.393383 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.893197 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.893271 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.893602 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.392329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.892409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:34.892764 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:35.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:35.892454 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.392716 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:36.892870 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:37.392455 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.392936 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:37.893011 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.893129 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.893427 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.393291 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.393628 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.893349 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.893718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:38.893795 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:39.393265 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.393337 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.393588 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:39.893331 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.893408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.893734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.892710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:41.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:41.392842 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:41.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.892453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.892740 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.893071 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:43.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.392714 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.393030 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:43.393087 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:43.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.892423 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.892741 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:45.892958 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:46.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:46.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.892695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.392510 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.392586 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.392951 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:48.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.392712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:48.392768 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:48.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.392787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.892646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:50.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:50.392909 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:50.892579 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.892652 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.892985 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.392709 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.392792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.892354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:52.892725 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:53.392414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.392831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:53.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:54.892908 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:55.392584 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.392665 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.392996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:55.892326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.892800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.392401 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.392474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.892944 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:56.892999 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:57.392855 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.392925 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:57.893155 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.893229 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.893537 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.393357 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.393431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.393713 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:59.392495 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.392587 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.392917 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:59.392973 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:59.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.893052 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:01.392764 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.392860 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:01.393227 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:01.892948 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.893270 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.393163 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.393444 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.892430 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.892742 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.892755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:03.892801 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:04.392448 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.392523 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:04.892336 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.892677 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:05.892848 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:06.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:06.892407 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.892929 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.392690 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.392765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.393085 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.893075 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.893147 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:07.893442 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:08.393253 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.393325 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.393644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:08.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.392738 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:10.392423 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:10.392894 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:10.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.392445 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.392851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.892553 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.892632 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.892973 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.392666 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:12.892876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:13.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.392815 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:13.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.392377 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:14.892953 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:15.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.392701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:15.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.392442 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.392833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.892667 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:17.392522 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.392590 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:17.392921 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:17.892652 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.892728 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.893037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.892817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.392472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.892338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:19.892782 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:20.392466 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:20.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:21.892849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:22.392532 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.392615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.392957 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:22.892669 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.892988 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.392497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.892580 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.892912 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:23.892972 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:24.392362 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.392433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.392780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:24.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.892471 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.392469 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.392543 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:26.392450 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.392522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.392876 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:26.392935 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:26.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.892686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.392919 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.392992 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.393253 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.893175 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.893249 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.893584 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.392338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.892451 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:28.892862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:29.392512 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.392870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:29.892578 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.892656 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.392314 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.893263 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.893356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.893732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:30.893797 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:31.392476 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.392560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:31.892342 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.892698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.392782 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:33.392366 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.392818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:33.392876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:33.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.892447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.392507 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.392934 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.892413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:35.392488 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:35.392932 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:35.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.892504 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.392359 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.392660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.393104 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:37.393155 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:37.892886 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.892951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.893194 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.392988 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.393067 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.393390 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.893201 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.893273 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.893589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:39.393344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.393415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.393662 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:39.393702 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:39.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.392451 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.392529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.392862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.892870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:42.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:42.892516 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.892611 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.892938 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.392638 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.392711 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.393037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.892699 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.892765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:43.893055 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:44.392565 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.392645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.392956 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:44.892656 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.892731 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.893058 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.392581 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.392846 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.393290 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.893123 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.893447 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:45.893502 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:46.393297 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.393390 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.393780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:46.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.892799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.392560 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.392631 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.392960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.892485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:48.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.392663 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:48.392703 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:48.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.392544 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.392943 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:50.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:50.392849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:50.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.392654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.892315 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.892388 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.892718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:52.392427 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:52.392889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:52.892324 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.892546 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.892874 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.392387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.892485 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.892882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:54.892940 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:55.392636 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.393066 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:55.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:57.392739 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.392818 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.393074 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:57.393125 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:57.892970 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.893044 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.893385 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.393560 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.893294 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.893360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.893621 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.892745 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:59.892790 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:00.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.392421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:00.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.892467 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.392584 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:02.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.392819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:02.392871 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:02.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.392644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.392425 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.892634 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:04.892673 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:05.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:05.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.892809 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.392658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:06.892843 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:07.392572 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.392646 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:07.892874 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.892944 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.893199 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.393002 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.393080 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.393411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.893209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.893286 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.893674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:08.893724 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:09.392325 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.392655 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:09.892370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.392431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:11.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:11.392883 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:11.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.892939 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.392370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.892624 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:13.392523 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.392903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:13.392963 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:13.892519 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.892431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.892510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.392321 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.892434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.892732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:15.892778 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:16.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:16.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.893341 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.893646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.392499 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.392579 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.892490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:17.892807 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:18.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:18.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.393074 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.393141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.393442 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.893090 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.893170 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.893422 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:19.893473 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:20.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.393284 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.393611 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:20.893284 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.893361 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:22.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:22.392844 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:22.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.892425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.892708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.392485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.392394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.392659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.892838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:24.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:25.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:25.892357 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.892457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.892775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.392472 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.892582 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.892678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.893022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:26.893073 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:27.392724 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.392791 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.393032 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:27.893016 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.893101 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.893433 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.393326 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.892772 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:29.392456 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.392869 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:29.392915 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:29.892602 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.892673 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.892440 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.392566 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.892417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:31.892716 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:32.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.392776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:32.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.892858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:33.392332 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.397972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1218 00:43:33.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:33.892918 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:34.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:34.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.892442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.892725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.392438 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.392511 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.892534 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.892662 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:35.893039 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:36.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:36.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.392739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.892912 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.892985 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.893251 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:37.893290 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:38.393068 1195787 node_ready.go:38] duration metric: took 6m0.000870722s for node "functional-288604" to be "Ready" ...
	I1218 00:43:38.396243 1195787 out.go:203] 
	W1218 00:43:38.399208 1195787 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:43:38.399223 1195787 out.go:285] * 
	* 
	W1218 00:43:38.401353 1195787 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:43:38.404386 1195787 out.go:203] 

** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-288604 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m6.572876053s for "functional-288604" cluster.
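The failure above is minikube's node-readiness wait timing out: as the stderr shows, it polls `GET /api/v1/nodes/functional-288604` every 500ms, swallowing transient "connection refused" errors, until the node reports Ready or the 6m deadline expires (`node_ready.go:38`). The shape of that retry loop can be sketched as follows; this is a hypothetical stdlib-only illustration of the pattern, not minikube's actual Go implementation:

```python
import time

def wait_for(check, timeout=360.0, interval=0.5):
    """Poll check() until it returns True or `timeout` seconds elapse.

    Mirrors the pattern in the log above: each attempt may fail with a
    transient error (e.g. "dial tcp ... connection refused"), which is
    swallowed and retried. Returns elapsed seconds on success; raises
    TimeoutError once the deadline passes, like WaitNodeCondition.
    """
    start = time.monotonic()
    while True:
        try:
            if check():
                return time.monotonic() - start
        except ConnectionError:
            pass  # transient failure while the apiserver is down; retry
        if time.monotonic() - start >= timeout:
            raise TimeoutError(f"condition not met within {timeout}s")
        time.sleep(interval)
```

In the real run, `check()` never succeeded because the apiserver on 192.168.49.2:8441 stayed unreachable, so the loop hit the 6m0s deadline and minikube exited with GUEST_START (exit status 80).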
I1218 00:43:39.016533 1159552 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
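The `NetworkSettings.Ports` block in the inspect output above is what maps the container's service ports (22, 2376, 5000, 8441, 32443) to ephemeral host ports on 127.0.0.1. That mapping can be pulled out of `docker inspect` JSON with a short script; the sample below is a trimmed, hand-copied subset of the output above, used purely for illustration:

```python
import json

# Trimmed sample of the `docker inspect functional-288604` output above.
sample = json.loads("""
[{"Name": "/functional-288604",
  "NetworkSettings": {"Ports": {
    "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "33925"}],
    "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "33928"}]}}}]
""")

def host_ports(inspect_output):
    """Return {container_port: "HostIp:HostPort"} for the first container.

    Each entry in NetworkSettings.Ports is a list of bindings; unbound
    ports have an empty list and are skipped.
    """
    ports = inspect_output[0]["NetworkSettings"]["Ports"]
    return {
        cport: f"{binds[0]['HostIp']}:{binds[0]['HostPort']}"
        for cport, binds in ports.items() if binds
    }

print(host_ports(sample))
```

Note that the bindings are present and the container is Running, so the host-side plumbing looks intact; the timeout came from the apiserver behind 127.0.0.1:33928 (8441/tcp) refusing connections, not from a missing port mapping.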
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (361.269853ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 logs -n 25: (1.030645924s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                           ARGS                                                                            │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/1159552.pem                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image save kicbase/echo-server:functional-240845 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image rm kicbase/echo-server:functional-240845 --alsologtostderr                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/11595522.pem                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/11595522.pem                                                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/test/nested/copy/1159552/hosts                                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image save --daemon kicbase/echo-server:functional-240845 --alsologtostderr                                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format json --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh pgrep buildkitd                                                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image          │ functional-240845 image ls --format yaml --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format table --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format short --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete         │ -p functional-240845                                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start          │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start          │ -p functional-288604 --alsologtostderr -v=8                                                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:37:32
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:37:32.486183 1195787 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:37:32.486610 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486624 1195787 out.go:374] Setting ErrFile to fd 2...
	I1218 00:37:32.486629 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486918 1195787 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:37:32.487313 1195787 out.go:368] Setting JSON to false
	I1218 00:37:32.488152 1195787 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26401,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:37:32.488255 1195787 start.go:143] virtualization:  
	I1218 00:37:32.491971 1195787 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:37:32.494842 1195787 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:37:32.494944 1195787 notify.go:221] Checking for updates...
	I1218 00:37:32.500434 1195787 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:37:32.503311 1195787 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:32.506071 1195787 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:37:32.508979 1195787 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:37:32.511873 1195787 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:37:32.515326 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:32.515476 1195787 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:37:32.549560 1195787 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:37:32.549709 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.608968 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.600331572 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.609068 1195787 docker.go:319] overlay module found
	I1218 00:37:32.612053 1195787 out.go:179] * Using the docker driver based on existing profile
	I1218 00:37:32.614859 1195787 start.go:309] selected driver: docker
	I1218 00:37:32.614879 1195787 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.614985 1195787 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:37:32.615081 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.681718 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.67244891 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.682130 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:32.682189 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:32.682255 1195787 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.687138 1195787 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:37:32.690134 1195787 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:37:32.693078 1195787 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:37:32.696069 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:32.696123 1195787 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:37:32.696143 1195787 cache.go:65] Caching tarball of preloaded images
	I1218 00:37:32.696183 1195787 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:37:32.696303 1195787 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:37:32.696317 1195787 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:37:32.696417 1195787 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:37:32.714975 1195787 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:37:32.714995 1195787 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:37:32.715013 1195787 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:37:32.715043 1195787 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:37:32.715099 1195787 start.go:364] duration metric: took 33.796µs to acquireMachinesLock for "functional-288604"
	I1218 00:37:32.715121 1195787 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:37:32.715131 1195787 fix.go:54] fixHost starting: 
	I1218 00:37:32.715395 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:32.731575 1195787 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:37:32.731606 1195787 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:37:32.734910 1195787 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:37:32.734955 1195787 machine.go:94] provisionDockerMachine start ...
	I1218 00:37:32.735034 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.751418 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.751747 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.751760 1195787 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:37:32.904326 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:32.904350 1195787 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:37:32.904413 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.933199 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.933525 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.933536 1195787 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:37:33.096692 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:33.096816 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.115124 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.115445 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.115466 1195787 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:37:33.272592 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:37:33.272617 1195787 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:37:33.272637 1195787 ubuntu.go:190] setting up certificates
	I1218 00:37:33.272647 1195787 provision.go:84] configureAuth start
	I1218 00:37:33.272712 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:33.291737 1195787 provision.go:143] copyHostCerts
	I1218 00:37:33.291803 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291863 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:37:33.291880 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291977 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:37:33.292105 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292127 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:37:33.292137 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292177 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:37:33.292274 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292300 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:37:33.292315 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292347 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:37:33.292433 1195787 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:37:33.397529 1195787 provision.go:177] copyRemoteCerts
	I1218 00:37:33.397646 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:37:33.397692 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.416603 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:33.523879 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:37:33.523950 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:37:33.540143 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:37:33.540204 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:37:33.557091 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:37:33.557194 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:37:33.573937 1195787 provision.go:87] duration metric: took 301.27685ms to configureAuth
	I1218 00:37:33.573963 1195787 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:37:33.574138 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:33.574247 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.591351 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.591663 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.591676 1195787 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:37:33.932454 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:37:33.932478 1195787 machine.go:97] duration metric: took 1.197515142s to provisionDockerMachine
	I1218 00:37:33.932490 1195787 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:37:33.932503 1195787 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:37:33.932581 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:37:33.932636 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.953296 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.060199 1195787 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:37:34.063627 1195787 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:37:34.063655 1195787 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:37:34.063660 1195787 command_runner.go:130] > VERSION_ID="12"
	I1218 00:37:34.063664 1195787 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:37:34.063680 1195787 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:37:34.063684 1195787 command_runner.go:130] > ID=debian
	I1218 00:37:34.063689 1195787 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:37:34.063694 1195787 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:37:34.063700 1195787 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:37:34.063783 1195787 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:37:34.063800 1195787 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:37:34.063810 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:37:34.063871 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:37:34.063955 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:37:34.063966 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:37:34.064048 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:37:34.064056 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:37:34.064100 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:37:34.071756 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:34.089207 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:37:34.106978 1195787 start.go:296] duration metric: took 174.472072ms for postStartSetup
	I1218 00:37:34.107054 1195787 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:37:34.107096 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.124265 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.224786 1195787 command_runner.go:130] > 12%
	I1218 00:37:34.224858 1195787 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:37:34.228879 1195787 command_runner.go:130] > 171G
	I1218 00:37:34.229324 1195787 fix.go:56] duration metric: took 1.514188493s for fixHost
	I1218 00:37:34.229353 1195787 start.go:83] releasing machines lock for "functional-288604", held for 1.514233177s
	I1218 00:37:34.229425 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:34.246154 1195787 ssh_runner.go:195] Run: cat /version.json
	I1218 00:37:34.246206 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.246451 1195787 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:37:34.246509 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.266363 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.276260 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.371623 1195787 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:37:34.371754 1195787 ssh_runner.go:195] Run: systemctl --version
	I1218 00:37:34.461010 1195787 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:37:34.461057 1195787 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:37:34.461077 1195787 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:37:34.461152 1195787 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:37:34.497659 1195787 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:37:34.501645 1195787 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:37:34.502005 1195787 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:37:34.502070 1195787 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:37:34.509755 1195787 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:37:34.509780 1195787 start.go:496] detecting cgroup driver to use...
	I1218 00:37:34.509811 1195787 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:37:34.509875 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:37:34.523916 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:37:34.536646 1195787 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:37:34.536736 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:37:34.551504 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:37:34.564054 1195787 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:37:34.675890 1195787 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:37:34.798642 1195787 docker.go:234] disabling docker service ...
	I1218 00:37:34.798703 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:37:34.813006 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:37:34.825087 1195787 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:37:34.942798 1195787 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:37:35.067868 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:37:35.088600 1195787 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:37:35.102366 1195787 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:37:35.103752 1195787 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:37:35.103819 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.113147 1195787 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:37:35.113241 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.122530 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.131393 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.140799 1195787 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:37:35.148737 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.157396 1195787 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.165643 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.174650 1195787 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:37:35.181215 1195787 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:37:35.182122 1195787 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:37:35.189136 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.306446 1195787 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:37:35.483449 1195787 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:37:35.483550 1195787 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:37:35.487145 1195787 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:37:35.487172 1195787 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:37:35.487179 1195787 command_runner.go:130] > Device: 0,72	Inode: 1642        Links: 1
	I1218 00:37:35.487186 1195787 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:35.487202 1195787 command_runner.go:130] > Access: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487220 1195787 command_runner.go:130] > Modify: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487225 1195787 command_runner.go:130] > Change: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487229 1195787 command_runner.go:130] >  Birth: -
	I1218 00:37:35.487254 1195787 start.go:564] Will wait 60s for crictl version
	I1218 00:37:35.487306 1195787 ssh_runner.go:195] Run: which crictl
	I1218 00:37:35.490344 1195787 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:37:35.490683 1195787 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:37:35.512944 1195787 command_runner.go:130] > Version:  0.1.0
	I1218 00:37:35.513232 1195787 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:37:35.513363 1195787 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:37:35.513391 1195787 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:37:35.515559 1195787 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
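The `crictl version` output captured above is a simple `Key:  value` listing. A sketch of parsing it into a dictionary (hypothetical helper, assuming only the colon-separated format shown in the log):

```python
def parse_crictl_version(output: str) -> dict:
    """Parse the key/value lines printed by `crictl version` into a dict."""
    info = {}
    for line in output.splitlines():
        key, sep, value = line.partition(":")
        if sep:  # skip lines without a colon
            info[key.strip()] = value.strip()
    return info
```

Applied to the output above, this yields `RuntimeName` of `cri-o` and `RuntimeVersion` of `1.34.3`.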
	I1218 00:37:35.515677 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.541522 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.541589 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.541609 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.541630 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.541651 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.541672 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.541692 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.541720 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.541741 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.541768 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.541786 1195787 command_runner.go:130] >      static
	I1218 00:37:35.541805 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.541829 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.541856 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.541889 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.541915 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.541933 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.541952 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.541983 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.541999 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.543191 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.569029 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.569102 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.569122 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.569144 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.569164 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.569191 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.569210 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.569239 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.569267 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.569285 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.569302 1195787 command_runner.go:130] >      static
	I1218 00:37:35.569320 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.569347 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.569366 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.569384 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.569405 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.569429 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.569449 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.569467 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.569485 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.575974 1195787 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:37:35.578737 1195787 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:37:35.594362 1195787 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:37:35.598161 1195787 command_runner.go:130] > 192.168.49.1	host.minikube.internal
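The `grep` above checks whether `/etc/hosts` already maps `host.minikube.internal` to the gateway IP, so the entry is only appended once. A sketch of that idempotent check as a pure function (hypothetical `ensure_hosts_entry`; the real code greps and appends over SSH):

```python
def ensure_hosts_entry(hosts_text: str, ip: str, hostname: str) -> str:
    """Append `ip<TAB>hostname` unless an entry for that IP/hostname pair exists."""
    for line in hosts_text.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[0] == ip and hostname in fields[1:]:
            return hosts_text  # already present, nothing to do
    if hosts_text and not hosts_text.endswith("\n"):
        hosts_text += "\n"
    return hosts_text + f"{ip}\t{hostname}\n"
```

Running it twice with the same arguments leaves the file unchanged on the second pass, matching the grep-before-write behavior in the log.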
	I1218 00:37:35.598363 1195787 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:37:35.598485 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:35.598543 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.635547 1195787 command_runner.go:130] > {
	I1218 00:37:35.635578 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.635584 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635591 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.635596 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635602 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.635605 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635609 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635623 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.635631 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.635634 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635639 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.635643 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635650 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635654 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635657 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635668 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.635672 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635677 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.635680 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635684 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635693 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.635701 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.635704 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635709 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.635712 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635719 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635723 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635731 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.635735 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635740 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.635743 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635747 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635758 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.635773 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.635777 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635781 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.635786 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.635790 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635793 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635795 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635802 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.635805 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635810 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.635815 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635823 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635830 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.635838 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.635841 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635845 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.635848 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635852 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635855 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635864 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635868 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635872 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635875 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635881 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.635885 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635890 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.635893 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635897 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635905 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.635912 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.635915 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635920 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.635926 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635930 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635934 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635938 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635941 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635944 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635947 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635954 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.635957 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635963 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.635966 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635970 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635978 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.635986 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.635989 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635993 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.635997 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636000 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636003 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636007 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636011 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636013 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636016 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636022 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.636027 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636032 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.636035 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636039 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636046 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.636054 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.636057 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636060 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.636064 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636073 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636077 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636079 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636086 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.636090 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636095 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.636098 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636102 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636110 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.636125 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.636133 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636137 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.636140 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636144 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636147 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636151 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636154 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636158 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636160 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636166 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.636170 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636175 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.636178 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636182 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636190 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.636197 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.636200 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636204 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.636208 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636211 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.636214 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636238 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636243 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.636251 1195787 command_runner.go:130] >     }
	I1218 00:37:35.636254 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.636256 1195787 command_runner.go:130] > }
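The `crio.go:514` conclusion that follows ("all images are preloaded") is reached by checking the JSON above: every image the Kubernetes version needs must already appear in the runtime's image store. A sketch of that check (hypothetical helpers `preloaded_tags` / `all_images_preloaded`, assuming only the `images[].repoTags` shape shown in the `crictl images --output json` output):

```python
import json

def preloaded_tags(crictl_json: str) -> set:
    """Collect every repo tag reported by `sudo crictl images --output json`."""
    data = json.loads(crictl_json)
    tags = set()
    for image in data.get("images", []):
        tags.update(image.get("repoTags", []))
    return tags

def all_images_preloaded(crictl_json: str, required: list) -> bool:
    """True when every required tag is already present in the image store."""
    return set(required).issubset(preloaded_tags(crictl_json))
```

When the check passes, the tarball extraction step is skipped, which is exactly what the next log lines report.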
	I1218 00:37:35.636431 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.636439 1195787 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:37:35.636495 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.658094 1195787 command_runner.go:130] > {
	I1218 00:37:35.658111 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.658115 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658124 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.658128 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658134 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.658137 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658141 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658151 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.658159 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.658163 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658167 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.658171 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658176 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658179 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658182 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658189 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.658192 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658198 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.658201 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658205 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658213 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.658222 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.658225 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658229 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.658233 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658242 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658250 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658262 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658269 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.658273 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658279 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.658282 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658286 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658294 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.658302 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.658305 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658309 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.658313 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.658317 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658321 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658323 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658330 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.658334 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658339 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.658344 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658348 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658356 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.658367 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.658370 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658374 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.658378 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658382 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658384 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658393 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658397 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658400 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658403 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658410 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.658413 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658425 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.658431 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658435 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658443 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.658455 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.658465 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658469 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.658472 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658476 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658479 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658483 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658487 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658490 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658493 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658499 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.658503 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658508 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.658511 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658515 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658523 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.658532 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.658535 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658539 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.658543 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658549 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658552 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658556 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658560 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658563 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658566 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658572 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.658577 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658582 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.658589 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658598 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658605 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.658613 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.658616 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658620 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.658624 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658628 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658631 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658642 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658650 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.658653 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658659 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.658662 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658666 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658677 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.658694 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.658697 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658701 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.658705 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658708 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658711 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658715 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658718 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658721 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658731 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.658734 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658739 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.658742 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658746 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658754 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.658761 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.658772 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658777 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.658781 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658784 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.658788 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658791 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658794 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.658798 1195787 command_runner.go:130] >     }
	I1218 00:37:35.658800 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.658803 1195787 command_runner.go:130] > }
	I1218 00:37:35.660205 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.660262 1195787 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:37:35.660279 1195787 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:37:35.660385 1195787 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:37:35.660470 1195787 ssh_runner.go:195] Run: crio config
	I1218 00:37:35.707278 1195787 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:37:35.707300 1195787 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:37:35.707307 1195787 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:37:35.707310 1195787 command_runner.go:130] > #
	I1218 00:37:35.707318 1195787 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:37:35.707324 1195787 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:37:35.707330 1195787 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:37:35.707346 1195787 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:37:35.707349 1195787 command_runner.go:130] > # reload'.
	I1218 00:37:35.707356 1195787 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:37:35.707362 1195787 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:37:35.707368 1195787 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:37:35.707383 1195787 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:37:35.707387 1195787 command_runner.go:130] > [crio]
	I1218 00:37:35.707393 1195787 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:37:35.707398 1195787 command_runner.go:130] > # containers images, in this directory.
	I1218 00:37:35.707595 1195787 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:37:35.707607 1195787 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:37:35.707620 1195787 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:37:35.707627 1195787 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:37:35.707631 1195787 command_runner.go:130] > # imagestore = ""
	I1218 00:37:35.707637 1195787 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:37:35.707643 1195787 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:37:35.707768 1195787 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:37:35.707777 1195787 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:37:35.707784 1195787 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:37:35.707788 1195787 command_runner.go:130] > # storage_option = [
	I1218 00:37:35.707935 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.707952 1195787 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:37:35.707959 1195787 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:37:35.707971 1195787 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:37:35.707978 1195787 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:37:35.707984 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:37:35.707990 1195787 command_runner.go:130] > # always happen on a node reboot
	I1218 00:37:35.708138 1195787 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:37:35.708160 1195787 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:37:35.708174 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:37:35.708183 1195787 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:37:35.708342 1195787 command_runner.go:130] > # version_file_persist = ""
	I1218 00:37:35.708354 1195787 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:37:35.708363 1195787 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:37:35.708367 1195787 command_runner.go:130] > # internal_wipe = true
	I1218 00:37:35.708381 1195787 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:37:35.708388 1195787 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:37:35.708503 1195787 command_runner.go:130] > # internal_repair = true
	I1218 00:37:35.708512 1195787 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:37:35.708519 1195787 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:37:35.708525 1195787 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:37:35.708671 1195787 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:37:35.708682 1195787 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:37:35.708686 1195787 command_runner.go:130] > [crio.api]
	I1218 00:37:35.708706 1195787 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:37:35.708833 1195787 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:37:35.708843 1195787 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:37:35.708997 1195787 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:37:35.709007 1195787 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:37:35.709013 1195787 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:37:35.709016 1195787 command_runner.go:130] > # stream_port = "0"
	I1218 00:37:35.709022 1195787 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:37:35.709150 1195787 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:37:35.709160 1195787 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:37:35.709282 1195787 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:37:35.709292 1195787 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:37:35.709298 1195787 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709420 1195787 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:37:35.709430 1195787 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:37:35.709436 1195787 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709440 1195787 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:37:35.709447 1195787 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:37:35.709453 1195787 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:37:35.709462 1195787 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:37:35.709593 1195787 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:37:35.709614 1195787 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709735 1195787 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:37:35.709746 1195787 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709864 1195787 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:37:35.709875 1195787 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:37:35.709881 1195787 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:37:35.709885 1195787 command_runner.go:130] > [crio.runtime]
	I1218 00:37:35.709891 1195787 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:37:35.709896 1195787 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:37:35.709907 1195787 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:37:35.709913 1195787 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:37:35.709917 1195787 command_runner.go:130] > # default_ulimits = [
	I1218 00:37:35.710017 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710026 1195787 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:37:35.710154 1195787 command_runner.go:130] > # no_pivot = false
	I1218 00:37:35.710163 1195787 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:37:35.710170 1195787 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:37:35.710300 1195787 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:37:35.710309 1195787 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:37:35.710323 1195787 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:37:35.710336 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710476 1195787 command_runner.go:130] > # conmon = ""
	I1218 00:37:35.710485 1195787 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:37:35.710492 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:37:35.710496 1195787 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:37:35.710508 1195787 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:37:35.710514 1195787 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:37:35.710521 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710524 1195787 command_runner.go:130] > # conmon_env = [
	I1218 00:37:35.710624 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710633 1195787 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:37:35.710639 1195787 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:37:35.710644 1195787 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:37:35.710648 1195787 command_runner.go:130] > # default_env = [
	I1218 00:37:35.710790 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710800 1195787 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:37:35.710816 1195787 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:37:35.710953 1195787 command_runner.go:130] > # selinux = false
	I1218 00:37:35.710964 1195787 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:37:35.710972 1195787 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:37:35.710977 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.710981 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.710993 1195787 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:37:35.710999 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711131 1195787 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:37:35.711142 1195787 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:37:35.711149 1195787 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:37:35.711162 1195787 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:37:35.711169 1195787 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:37:35.711174 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711345 1195787 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:37:35.711373 1195787 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:37:35.711401 1195787 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:37:35.711419 1195787 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:37:35.711456 1195787 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:37:35.711477 1195787 command_runner.go:130] > # blockio parameters.
	I1218 00:37:35.711667 1195787 command_runner.go:130] > # blockio_reload = false
	I1218 00:37:35.711706 1195787 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:37:35.711725 1195787 command_runner.go:130] > # irqbalance daemon.
	I1218 00:37:35.711743 1195787 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:37:35.711776 1195787 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:37:35.711801 1195787 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:37:35.711821 1195787 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:37:35.711855 1195787 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:37:35.711879 1195787 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:37:35.711898 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.712052 1195787 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:37:35.712092 1195787 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:37:35.712112 1195787 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:37:35.712133 1195787 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:37:35.712151 1195787 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:37:35.712187 1195787 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:37:35.712206 1195787 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:37:35.712253 1195787 command_runner.go:130] > # will be added.
	I1218 00:37:35.712276 1195787 command_runner.go:130] > # default_capabilities = [
	I1218 00:37:35.712420 1195787 command_runner.go:130] > # 	"CHOWN",
	I1218 00:37:35.712461 1195787 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:37:35.712541 1195787 command_runner.go:130] > # 	"FSETID",
	I1218 00:37:35.712631 1195787 command_runner.go:130] > # 	"FOWNER",
	I1218 00:37:35.712660 1195787 command_runner.go:130] > # 	"SETGID",
	I1218 00:37:35.712794 1195787 command_runner.go:130] > # 	"SETUID",
	I1218 00:37:35.712896 1195787 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:37:35.712994 1195787 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:37:35.713065 1195787 command_runner.go:130] > # 	"KILL",
	I1218 00:37:35.713149 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.713172 1195787 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:37:35.713258 1195787 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:37:35.713410 1195787 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:37:35.713489 1195787 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:37:35.713545 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.713716 1195787 command_runner.go:130] > default_sysctls = [
	I1218 00:37:35.713734 1195787 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:37:35.713949 1195787 command_runner.go:130] > ]
	I1218 00:37:35.713959 1195787 command_runner.go:130] > # List of devices on the host that a
	I1218 00:37:35.713966 1195787 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:37:35.713970 1195787 command_runner.go:130] > # allowed_devices = [
	I1218 00:37:35.713995 1195787 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:37:35.714000 1195787 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:37:35.714003 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714008 1195787 command_runner.go:130] > # List of additional devices. specified as
	I1218 00:37:35.714016 1195787 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:37:35.714022 1195787 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:37:35.714028 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.714032 1195787 command_runner.go:130] > # additional_devices = [
	I1218 00:37:35.714035 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714040 1195787 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:37:35.714044 1195787 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:37:35.714048 1195787 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:37:35.714052 1195787 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:37:35.714056 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714062 1195787 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:37:35.714068 1195787 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:37:35.714077 1195787 command_runner.go:130] > # Defaults to false.
	I1218 00:37:35.714083 1195787 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:37:35.714089 1195787 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:37:35.714100 1195787 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:37:35.714537 1195787 command_runner.go:130] > # hooks_dir = [
	I1218 00:37:35.714675 1195787 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:37:35.714791 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.715258 1195787 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:37:35.715414 1195787 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:37:35.715601 1195787 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:37:35.715650 1195787 command_runner.go:130] > #
	I1218 00:37:35.715843 1195787 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:37:35.715943 1195787 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:37:35.716060 1195787 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:37:35.716083 1195787 command_runner.go:130] > #
	I1218 00:37:35.716111 1195787 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:37:35.716131 1195787 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:37:35.716166 1195787 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:37:35.716187 1195787 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:37:35.716204 1195787 command_runner.go:130] > #
	I1218 00:37:35.716248 1195787 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:37:35.716275 1195787 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:37:35.716306 1195787 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:37:35.717368 1195787 command_runner.go:130] > # pids_limit = -1
	I1218 00:37:35.717418 1195787 command_runner.go:130] > # Maximum sized allowed for the container log file. Negative numbers indicate
	I1218 00:37:35.717442 1195787 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:37:35.717463 1195787 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:37:35.717499 1195787 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:37:35.717521 1195787 command_runner.go:130] > # log_size_max = -1
	I1218 00:37:35.717693 1195787 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:37:35.717720 1195787 command_runner.go:130] > # log_to_journald = false
	I1218 00:37:35.717752 1195787 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:37:35.717776 1195787 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:37:35.717810 1195787 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:37:35.717835 1195787 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:37:35.717855 1195787 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:37:35.717888 1195787 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:37:35.717911 1195787 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:37:35.717929 1195787 command_runner.go:130] > # read_only = false
	I1218 00:37:35.717949 1195787 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:37:35.717978 1195787 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:37:35.717999 1195787 command_runner.go:130] > # live configuration reload.
	I1218 00:37:35.718017 1195787 command_runner.go:130] > # log_level = "info"
	I1218 00:37:35.718039 1195787 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:37:35.718073 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.718091 1195787 command_runner.go:130] > # log_filter = ""
	I1218 00:37:35.718112 1195787 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718144 1195787 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:37:35.718167 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718189 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718221 1195787 command_runner.go:130] > # uid_mappings = ""
	I1218 00:37:35.718243 1195787 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718262 1195787 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:37:35.718280 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718311 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718343 1195787 command_runner.go:130] > # gid_mappings = ""
	I1218 00:37:35.718363 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:37:35.718395 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718420 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718442 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718481 1195787 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:37:35.718507 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:37:35.718529 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718561 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718589 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718607 1195787 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:37:35.718641 1195787 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:37:35.718665 1195787 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:37:35.718685 1195787 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:37:35.718717 1195787 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:37:35.718741 1195787 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:37:35.718762 1195787 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:37:35.718793 1195787 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:37:35.718814 1195787 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:37:35.718831 1195787 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:37:35.718851 1195787 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:37:35.718882 1195787 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:37:35.718907 1195787 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:37:35.718931 1195787 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:37:35.718965 1195787 command_runner.go:130] > # shared_cpuset  determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:37:35.718989 1195787 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:37:35.719009 1195787 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:37:35.719039 1195787 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:37:35.719315 1195787 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:37:35.719348 1195787 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:37:35.719365 1195787 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:37:35.719396 1195787 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:37:35.719423 1195787 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:37:35.719450 1195787 command_runner.go:130] > # pinns_path = ""
	I1218 00:37:35.719484 1195787 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:37:35.719510 1195787 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:37:35.719528 1195787 command_runner.go:130] > # enable_criu_support = true
	I1218 00:37:35.719563 1195787 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:37:35.719586 1195787 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:37:35.719602 1195787 command_runner.go:130] > # enable_pod_events = false
	I1218 00:37:35.719622 1195787 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:37:35.719651 1195787 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:37:35.719672 1195787 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:37:35.719690 1195787 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:37:35.719711 1195787 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:37:35.719747 1195787 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:37:35.719770 1195787 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:37:35.719795 1195787 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:37:35.719826 1195787 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:37:35.719849 1195787 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:37:35.719865 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.719885 1195787 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:37:35.719916 1195787 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:37:35.719938 1195787 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:37:35.719957 1195787 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:37:35.719973 1195787 command_runner.go:130] > #
	I1218 00:37:35.720002 1195787 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:37:35.720024 1195787 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:37:35.720041 1195787 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:37:35.720059 1195787 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:37:35.720096 1195787 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:37:35.720121 1195787 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:37:35.720139 1195787 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:37:35.720170 1195787 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:37:35.720190 1195787 command_runner.go:130] > # monitor_env = []
	I1218 00:37:35.720207 1195787 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:37:35.720256 1195787 command_runner.go:130] > # allowed_annotations = []
	I1218 00:37:35.720274 1195787 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:37:35.720279 1195787 command_runner.go:130] > # no_sync_log = false
	I1218 00:37:35.720284 1195787 command_runner.go:130] > # default_annotations = {}
	I1218 00:37:35.720288 1195787 command_runner.go:130] > # stream_websockets = false
	I1218 00:37:35.720292 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.720348 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.720360 1195787 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:37:35.720367 1195787 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:37:35.720386 1195787 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:37:35.720399 1195787 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:37:35.720403 1195787 command_runner.go:130] > #   in $PATH.
	I1218 00:37:35.720418 1195787 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:37:35.720433 1195787 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:37:35.720439 1195787 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:37:35.720444 1195787 command_runner.go:130] > #   state.
	I1218 00:37:35.720451 1195787 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:37:35.720460 1195787 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:37:35.720466 1195787 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:37:35.720473 1195787 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:37:35.720480 1195787 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:37:35.720496 1195787 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:37:35.720506 1195787 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:37:35.720513 1195787 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:37:35.720531 1195787 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:37:35.720543 1195787 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:37:35.720550 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:37:35.720566 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:37:35.720576 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:37:35.720582 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:37:35.720590 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:37:35.720628 1195787 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:37:35.720649 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:37:35.720665 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:37:35.720671 1195787 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:37:35.720679 1195787 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:37:35.720689 1195787 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:37:35.720706 1195787 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:37:35.720719 1195787 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:37:35.720733 1195787 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:37:35.720746 1195787 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:37:35.720754 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:37:35.720760 1195787 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:37:35.720764 1195787 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:37:35.720772 1195787 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:37:35.720777 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:37:35.720783 1195787 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:37:35.720795 1195787 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:37:35.720813 1195787 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:37:35.720825 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:37:35.720833 1195787 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:37:35.720849 1195787 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:37:35.720857 1195787 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:37:35.720865 1195787 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:37:35.720875 1195787 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:37:35.720882 1195787 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:37:35.720888 1195787 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:37:35.720897 1195787 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:37:35.720905 1195787 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:37:35.720932 1195787 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:37:35.720946 1195787 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:37:35.720960 1195787 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:37:35.720968 1195787 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:37:35.720975 1195787 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:37:35.720983 1195787 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:37:35.720995 1195787 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:37:35.721013 1195787 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:37:35.721023 1195787 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:37:35.721031 1195787 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:37:35.721033 1195787 command_runner.go:130] > #
	I1218 00:37:35.721038 1195787 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:37:35.721043 1195787 command_runner.go:130] > #
	I1218 00:37:35.721049 1195787 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:37:35.721058 1195787 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:37:35.721061 1195787 command_runner.go:130] > #
	I1218 00:37:35.721072 1195787 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:37:35.721082 1195787 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:37:35.721085 1195787 command_runner.go:130] > #
	I1218 00:37:35.721091 1195787 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:37:35.721097 1195787 command_runner.go:130] > # feature.
	I1218 00:37:35.721100 1195787 command_runner.go:130] > #
	I1218 00:37:35.721106 1195787 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:37:35.721112 1195787 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:37:35.721119 1195787 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:37:35.721125 1195787 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:37:35.721131 1195787 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:37:35.721141 1195787 command_runner.go:130] > #
	I1218 00:37:35.721147 1195787 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:37:35.721153 1195787 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:37:35.721158 1195787 command_runner.go:130] > #
	I1218 00:37:35.721164 1195787 command_runner.go:130] > # This also means that the Pod's "restartPolicy" has to be set to "Never",
	I1218 00:37:35.721170 1195787 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:37:35.721177 1195787 command_runner.go:130] > #
	I1218 00:37:35.721183 1195787 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:37:35.721188 1195787 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:37:35.721192 1195787 command_runner.go:130] > # limitation.
	I1218 00:37:35.721196 1195787 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:37:35.721200 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:37:35.721204 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721215 1195787 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:37:35.721228 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721232 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721236 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721241 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721248 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721251 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721255 1195787 command_runner.go:130] > allowed_annotations = [
	I1218 00:37:35.721261 1195787 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:37:35.721265 1195787 command_runner.go:130] > ]
	I1218 00:37:35.721270 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721274 1195787 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:37:35.721279 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:37:35.721282 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721289 1195787 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:37:35.721293 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721307 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721312 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721316 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721320 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721325 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721331 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721339 1195787 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:37:35.721347 1195787 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:37:35.721353 1195787 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:37:35.721361 1195787 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and a set of resources it supports mutating.
	I1218 00:37:35.721384 1195787 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:37:35.721399 1195787 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:37:35.721406 1195787 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:37:35.721417 1195787 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:37:35.721427 1195787 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:37:35.721438 1195787 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:37:35.721444 1195787 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:37:35.721457 1195787 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:37:35.721461 1195787 command_runner.go:130] > # Example:
	I1218 00:37:35.721466 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:37:35.721472 1195787 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:37:35.721477 1195787 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:37:35.721487 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:37:35.721490 1195787 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:37:35.721494 1195787 command_runner.go:130] > # cpushares = "5"
	I1218 00:37:35.721498 1195787 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:37:35.721502 1195787 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:37:35.721507 1195787 command_runner.go:130] > # cpulimit = "35"
	I1218 00:37:35.721510 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.721516 1195787 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:37:35.721524 1195787 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:37:35.721529 1195787 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:37:35.721535 1195787 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:37:35.721544 1195787 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:37:35.721552 1195787 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:37:35.721556 1195787 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:37:35.721563 1195787 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:37:35.721568 1195787 command_runner.go:130] > # Default value is set to true
	I1218 00:37:35.721574 1195787 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:37:35.721580 1195787 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:37:35.721588 1195787 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:37:35.721592 1195787 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:37:35.721621 1195787 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:37:35.721627 1195787 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:37:35.721635 1195787 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:37:35.721640 1195787 command_runner.go:130] > # timezone = ""
	I1218 00:37:35.721647 1195787 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:37:35.721650 1195787 command_runner.go:130] > #
	I1218 00:37:35.721656 1195787 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:37:35.721665 1195787 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:37:35.721672 1195787 command_runner.go:130] > [crio.image]
	I1218 00:37:35.721679 1195787 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:37:35.721683 1195787 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:37:35.721689 1195787 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:37:35.721701 1195787 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721706 1195787 command_runner.go:130] > # global_auth_file = ""
	I1218 00:37:35.721711 1195787 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:37:35.721723 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721728 1195787 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.721738 1195787 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:37:35.721745 1195787 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721754 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721758 1195787 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:37:35.721764 1195787 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:37:35.721769 1195787 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:37:35.721776 1195787 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:37:35.721781 1195787 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:37:35.721787 1195787 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:37:35.721793 1195787 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:37:35.721799 1195787 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:37:35.721805 1195787 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:37:35.721813 1195787 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:37:35.721819 1195787 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:37:35.721825 1195787 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:37:35.721831 1195787 command_runner.go:130] > # pinned_images = [
	I1218 00:37:35.721834 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721840 1195787 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:37:35.721846 1195787 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:37:35.721853 1195787 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:37:35.721859 1195787 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:37:35.721866 1195787 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:37:35.721871 1195787 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:37:35.721879 1195787 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:37:35.721892 1195787 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:37:35.721901 1195787 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:37:35.721912 1195787 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or system
	I1218 00:37:35.721918 1195787 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:37:35.721923 1195787 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:37:35.721928 1195787 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:37:35.721935 1195787 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:37:35.721938 1195787 command_runner.go:130] > # changing them here.
	I1218 00:37:35.721944 1195787 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:37:35.721955 1195787 command_runner.go:130] > # insecure_registries = [
	I1218 00:37:35.721957 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721964 1195787 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:37:35.721969 1195787 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:37:35.721977 1195787 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:37:35.721983 1195787 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:37:35.721987 1195787 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:37:35.721998 1195787 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:37:35.722005 1195787 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:37:35.722009 1195787 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:37:35.722015 1195787 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:37:35.722024 1195787 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:37:35.722031 1195787 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:37:35.722036 1195787 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:37:35.722048 1195787 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:37:35.722054 1195787 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:37:35.722062 1195787 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used, but the results are ambiguous.
	I1218 00:37:35.722070 1195787 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:37:35.722074 1195787 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:37:35.722081 1195787 command_runner.go:130] > # oci_artifact_mount_support controls whether CRI-O should support OCI artifacts.
	I1218 00:37:35.722087 1195787 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:37:35.722091 1195787 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:37:35.722097 1195787 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:37:35.722108 1195787 command_runner.go:130] > # CNI plugins.
	I1218 00:37:35.722117 1195787 command_runner.go:130] > [crio.network]
	I1218 00:37:35.722131 1195787 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:37:35.722136 1195787 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:37:35.722142 1195787 command_runner.go:130] > # cni_default_network = ""
	I1218 00:37:35.722148 1195787 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:37:35.722156 1195787 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:37:35.722162 1195787 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:37:35.722165 1195787 command_runner.go:130] > # plugin_dirs = [
	I1218 00:37:35.722169 1195787 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:37:35.722172 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722176 1195787 command_runner.go:130] > # List of included pod metrics.
	I1218 00:37:35.722180 1195787 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:37:35.722182 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722190 1195787 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:37:35.722196 1195787 command_runner.go:130] > [crio.metrics]
	I1218 00:37:35.722201 1195787 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:37:35.722205 1195787 command_runner.go:130] > # enable_metrics = false
	I1218 00:37:35.722209 1195787 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:37:35.722215 1195787 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:37:35.722222 1195787 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:37:35.722233 1195787 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:37:35.722239 1195787 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:37:35.722243 1195787 command_runner.go:130] > # metrics_collectors = [
	I1218 00:37:35.722247 1195787 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:37:35.722252 1195787 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:37:35.722256 1195787 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:37:35.722260 1195787 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:37:35.722266 1195787 command_runner.go:130] > # 	"operations_total",
	I1218 00:37:35.722270 1195787 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:37:35.722275 1195787 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:37:35.722279 1195787 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:37:35.722283 1195787 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:37:35.722287 1195787 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:37:35.722295 1195787 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:37:35.722299 1195787 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:37:35.722312 1195787 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:37:35.722316 1195787 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:37:35.722321 1195787 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:37:35.722325 1195787 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:37:35.722329 1195787 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:37:35.722332 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722338 1195787 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:37:35.722342 1195787 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:37:35.722347 1195787 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:37:35.722351 1195787 command_runner.go:130] > # metrics_port = 9090
	I1218 00:37:35.722358 1195787 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:37:35.722362 1195787 command_runner.go:130] > # metrics_socket = ""
	I1218 00:37:35.722377 1195787 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:37:35.722386 1195787 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:37:35.722398 1195787 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:37:35.722403 1195787 command_runner.go:130] > # certificate on any modification event.
	I1218 00:37:35.722406 1195787 command_runner.go:130] > # metrics_cert = ""
	I1218 00:37:35.722411 1195787 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:37:35.722421 1195787 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:37:35.722424 1195787 command_runner.go:130] > # metrics_key = ""
	I1218 00:37:35.722433 1195787 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:37:35.722437 1195787 command_runner.go:130] > [crio.tracing]
	I1218 00:37:35.722445 1195787 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:37:35.722451 1195787 command_runner.go:130] > # enable_tracing = false
	I1218 00:37:35.722464 1195787 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:37:35.722472 1195787 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:37:35.722479 1195787 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:37:35.722485 1195787 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:37:35.722490 1195787 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:37:35.722493 1195787 command_runner.go:130] > [crio.nri]
	I1218 00:37:35.722498 1195787 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:37:35.722507 1195787 command_runner.go:130] > # enable_nri = true
	I1218 00:37:35.722519 1195787 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:37:35.722524 1195787 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:37:35.722528 1195787 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:37:35.722539 1195787 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:37:35.722544 1195787 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:37:35.722549 1195787 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:37:35.722557 1195787 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:37:35.722613 1195787 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:37:35.722623 1195787 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:37:35.722628 1195787 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:37:35.722634 1195787 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:37:35.722640 1195787 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:37:35.722645 1195787 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:37:35.722651 1195787 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:37:35.722658 1195787 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:37:35.722663 1195787 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:37:35.722666 1195787 command_runner.go:130] > # - OCI hook injection
	I1218 00:37:35.722671 1195787 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:37:35.722677 1195787 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:37:35.722683 1195787 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:37:35.722689 1195787 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:37:35.722696 1195787 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:37:35.722702 1195787 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:37:35.722709 1195787 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:37:35.722712 1195787 command_runner.go:130] > #
	I1218 00:37:35.722717 1195787 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:37:35.722724 1195787 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:37:35.722729 1195787 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:37:35.722734 1195787 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:37:35.722739 1195787 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:37:35.722744 1195787 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:37:35.722749 1195787 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:37:35.722759 1195787 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:37:35.722765 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722771 1195787 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:37:35.722777 1195787 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:37:35.722788 1195787 command_runner.go:130] > [crio.stats]
	I1218 00:37:35.722797 1195787 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:37:35.722805 1195787 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:37:35.722809 1195787 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:37:35.722814 1195787 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:37:35.722821 1195787 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:37:35.722825 1195787 command_runner.go:130] > # collection_period = 0
	I1218 00:37:35.722870 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686277403Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:37:35.722885 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686455769Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:37:35.722906 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686635242Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:37:35.722915 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686725939Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:37:35.722930 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686860827Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.722940 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.687143526Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:37:35.722954 1195787 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:37:35.723070 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:35.723084 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:35.723105 1195787 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:37:35.723135 1195787 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath
:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:37:35.723264 1195787 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:37:35.723342 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:37:35.730799 1195787 command_runner.go:130] > kubeadm
	I1218 00:37:35.730815 1195787 command_runner.go:130] > kubectl
	I1218 00:37:35.730820 1195787 command_runner.go:130] > kubelet
	I1218 00:37:35.730852 1195787 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:37:35.730903 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:37:35.737892 1195787 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:37:35.749699 1195787 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:37:35.761635 1195787 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:37:35.773650 1195787 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:37:35.777155 1195787 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:37:35.777265 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.913809 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:36.641224 1195787 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:37:36.641246 1195787 certs.go:195] generating shared ca certs ...
	I1218 00:37:36.641263 1195787 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:36.641410 1195787 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:37:36.641464 1195787 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:37:36.641475 1195787 certs.go:257] generating profile certs ...
	I1218 00:37:36.641577 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:37:36.641667 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:37:36.641711 1195787 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:37:36.641724 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:37:36.641737 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:37:36.641753 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:37:36.641763 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:37:36.641780 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:37:36.641792 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:37:36.641807 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:37:36.641818 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:37:36.641873 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:37:36.641907 1195787 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:37:36.641920 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:37:36.641952 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:37:36.641982 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:37:36.642014 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:37:36.642068 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:36.642106 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:37:36.642122 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.642133 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.642704 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:37:36.662928 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:37:36.685489 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:37:36.708038 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:37:36.726679 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:37:36.744109 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:37:36.760724 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:37:36.777802 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:37:36.794736 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:37:36.811089 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:37:36.827838 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:37:36.844718 1195787 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:37:36.856626 1195787 ssh_runner.go:195] Run: openssl version
	I1218 00:37:36.862122 1195787 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:37:36.862595 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.869813 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:37:36.876968 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880287 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880319 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880364 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.920445 1195787 command_runner.go:130] > b5213941
	I1218 00:37:36.920887 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:37:36.928015 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.934857 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:37:36.941992 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945456 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945522 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945583 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.985712 1195787 command_runner.go:130] > 51391683
	I1218 00:37:36.986191 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:37:36.993294 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.001803 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:37:37.011590 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.016819 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017267 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017348 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.061113 1195787 command_runner.go:130] > 3ec20f2e
	I1218 00:37:37.061606 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:37:37.068668 1195787 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072025 1195787 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072050 1195787 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:37:37.072057 1195787 command_runner.go:130] > Device: 259,1	Inode: 1326178     Links: 1
	I1218 00:37:37.072063 1195787 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:37.072070 1195787 command_runner.go:130] > Access: 2025-12-18 00:33:28.828061434 +0000
	I1218 00:37:37.072075 1195787 command_runner.go:130] > Modify: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072080 1195787 command_runner.go:130] > Change: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072086 1195787 command_runner.go:130] >  Birth: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072155 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:37:37.111978 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.112489 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:37:37.152999 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.153074 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:37:37.194884 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.195292 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:37:37.235218 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.235658 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:37:37.275710 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.276177 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:37:37.316082 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.316486 1195787 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwa
rePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:37.316593 1195787 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:37:37.316685 1195787 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:37:37.341722 1195787 cri.go:89] found id: ""
	I1218 00:37:37.341828 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:37:37.348335 1195787 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:37:37.348357 1195787 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:37:37.348372 1195787 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:37:37.349183 1195787 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:37:37.349197 1195787 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:37:37.349253 1195787 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:37:37.356307 1195787 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:37:37.356734 1195787 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.356836 1195787 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "functional-288604" cluster setting kubeconfig missing "functional-288604" context setting]
	I1218 00:37:37.357097 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.357514 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.357675 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.358178 1195787 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:37:37.358185 1195787 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:37:37.358343 1195787 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:37:37.358365 1195787 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:37:37.358389 1195787 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:37:37.358400 1195787 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:37:37.358747 1195787 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:37:37.366250 1195787 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:37:37.366287 1195787 kubeadm.go:602] duration metric: took 17.084351ms to restartPrimaryControlPlane
	I1218 00:37:37.366297 1195787 kubeadm.go:403] duration metric: took 49.819997ms to StartCluster
	I1218 00:37:37.366310 1195787 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.366369 1195787 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.366947 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.367145 1195787 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:37:37.367532 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:37.367580 1195787 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:37:37.367705 1195787 addons.go:70] Setting storage-provisioner=true in profile "functional-288604"
	I1218 00:37:37.367724 1195787 addons.go:239] Setting addon storage-provisioner=true in "functional-288604"
	I1218 00:37:37.367744 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.368436 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.368583 1195787 addons.go:70] Setting default-storageclass=true in profile "functional-288604"
	I1218 00:37:37.368601 1195787 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-288604"
	I1218 00:37:37.368944 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.373199 1195787 out.go:179] * Verifying Kubernetes components...
	I1218 00:37:37.376080 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:37.397822 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.397983 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.398246 1195787 addons.go:239] Setting addon default-storageclass=true in "functional-288604"
	I1218 00:37:37.398278 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.398894 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.407451 1195787 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:37:37.410300 1195787 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.410322 1195787 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:37:37.410384 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.434096 1195787 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:37.434117 1195787 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:37:37.434174 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.457842 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.477819 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.583963 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:37.618382 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.637024 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.392142 1195787 node_ready.go:35] waiting up to 6m0s for node "functional-288604" to be "Ready" ...
	I1218 00:37:38.392289 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.392356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.392602 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392638 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392662 1195787 retry.go:31] will retry after 293.380468ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392710 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392727 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392733 1195787 retry.go:31] will retry after 283.333163ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:38.676355 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.686660 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:38.750557 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753745 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753775 1195787 retry.go:31] will retry after 508.906429ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753840 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753899 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753916 1195787 retry.go:31] will retry after 283.918132ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.893115 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.893199 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.893535 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.038817 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.092066 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.095485 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.095518 1195787 retry.go:31] will retry after 317.14343ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.262906 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.318327 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.322166 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.322196 1195787 retry.go:31] will retry after 611.398612ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.392378 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.413200 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.474250 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.474288 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.474326 1195787 retry.go:31] will retry after 551.991324ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.892368 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.892757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.933930 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.991113 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.991153 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.991172 1195787 retry.go:31] will retry after 590.272449ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.027415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:40.085906 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.089482 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.089515 1195787 retry.go:31] will retry after 1.798316027s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.392931 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.393007 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.393310 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:40.393376 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:40.582668 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:40.643859 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.643900 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.643941 1195787 retry.go:31] will retry after 1.196819353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.892768 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.392495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.841577 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:41.888099 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:41.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.901267 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.901306 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.901323 1195787 retry.go:31] will retry after 1.106575841s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948402 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.948447 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948500 1195787 retry.go:31] will retry after 1.314106681s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:42.393054 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.393195 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.393477 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:42.393524 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:42.893249 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.893594 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.008894 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:43.066157 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.066194 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.066212 1195787 retry.go:31] will retry after 2.952953914s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.263490 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:43.325047 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.325147 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.325201 1195787 retry.go:31] will retry after 2.165088511s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.892529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.892853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.392698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.892859 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:44.892927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:45.392615 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.393055 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:45.491313 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:45.548834 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:45.552259 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.552290 1195787 retry.go:31] will retry after 4.009218302s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.020180 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:46.081331 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:46.081373 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.081392 1195787 retry.go:31] will retry after 2.724964309s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.392810 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.392886 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.393216 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.893121 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.893451 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:46.893527 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:47.392312 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.392379 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.392690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:47.892435 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.892508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.806570 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:48.859925 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:48.863450 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.863523 1195787 retry.go:31] will retry after 5.125713123s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.892640 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.892710 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.892972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:49.392419 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.392858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:49.392930 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:49.562244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:49.616549 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:49.619912 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.619976 1195787 retry.go:31] will retry after 7.525324152s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.893380 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.893476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.893792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.392521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.892413 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.392501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:51.892896 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:52.392394 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:52.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.892886 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.892492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.990244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:54.052144 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:54.052189 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.052212 1195787 retry.go:31] will retry after 10.028215297s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:54.392879 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:54.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.892892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.892535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:56.892936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:57.146448 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:57.223873 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:57.223911 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.223929 1195787 retry.go:31] will retry after 7.68443688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.392441 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.392757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:57.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.892896 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.892496 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.892902 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:58.892976 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:59.392373 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.392729 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:59.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.392605 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.392823 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.393374 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.893180 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.893259 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.893590 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:00.893634 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:01.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.392775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:01.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.892560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.392629 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.393091 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.893045 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.893120 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.893431 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:03.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.393360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.393682 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:03.393752 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:03.892452 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.892588 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.081415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:04.149098 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.149154 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.149173 1195787 retry.go:31] will retry after 12.181474759s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.392826 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.892706 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.908952 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:04.983582 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.983679 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.983707 1195787 retry.go:31] will retry after 20.674508131s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:05.393152 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.393222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.393548 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:05.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:05.892840 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:06.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:06.892466 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.892581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.392796 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.392870 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.393185 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.893008 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.893099 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.893411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:07.893460 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:08.393200 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.393269 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.393580 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:08.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.392350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:10.392470 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.392542 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:10.392885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:10.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.392567 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.392748 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.392724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.892526 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.892600 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.892927 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:12.892994 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:13.392672 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.393083 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:13.892350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.892423 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:15.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.392797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:15.392868 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:15.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.331590 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:16.385966 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:16.389831 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.389871 1195787 retry.go:31] will retry after 10.81475415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.393176 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.393493 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.893314 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.893409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.893794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:17.392528 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.392670 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:17.393070 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:17.892892 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.892977 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.893319 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.393167 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.393296 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:19.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.392531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.393011 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:19.393093 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:19.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.892493 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.392457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.392540 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.892404 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.892505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.892840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.392336 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.392752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:21.892899 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:22.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.392824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:22.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.892650 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.892812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:24.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.392459 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.392739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:24.392822 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:24.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.392505 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.392578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.392919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.658449 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:25.718689 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:25.718787 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.718811 1195787 retry.go:31] will retry after 20.411460434s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.893032 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.893152 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.893496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:26.393268 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.393345 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.393658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:26.393735 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:26.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.205308 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:27.264744 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:27.264795 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.264821 1195787 retry.go:31] will retry after 26.872581906s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.393247 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.393343 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.393691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.392400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.392499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.892880 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:28.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:29.392435 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.392530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:29.892518 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.892615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.892959 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.392483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.892684 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:30.893088 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:31.392443 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.392538 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:31.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.392398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.892495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:33.392363 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.392786 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:33.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:33.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.892942 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:35.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:35.392919 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:35.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.892552 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.892909 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:37.392676 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.392747 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.393087 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:37.393163 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:37.892815 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.892918 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.893191 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.392972 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.393069 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.393395 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.893206 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.893283 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.893614 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.892914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:39.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:40.392478 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:40.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.392429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.892557 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.892907 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:42.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.392468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:42.392895 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.892798 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.893169 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.392965 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.393041 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.393425 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.893065 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.893192 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.893542 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:44.393319 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.393457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.393797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:44.393864 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:44.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.892887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.392407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.392691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.892831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.131350 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:46.207148 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:46.207192 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.207211 1195787 retry.go:31] will retry after 46.493082425s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.392632 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.393042 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.892356 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.892439 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:46.892784 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:47.392561 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.392655 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.393022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:47.892957 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.893052 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.393028 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.393151 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.393502 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.893231 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.893339 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:48.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:49.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.892410 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.892719 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.392567 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.392922 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.892639 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.892718 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.893017 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:51.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:51.392927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:51.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.392587 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.392689 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.393078 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.892911 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.893278 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:53.393104 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.393175 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.393506 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:53.393578 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:53.893174 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.893253 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.893635 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.138097 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:54.199604 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:54.199639 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.199657 1195787 retry.go:31] will retry after 32.999586692s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.392915 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.392997 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.393320 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.893151 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.893222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.893558 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:55.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.393372 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.393696 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:55.393771 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:55.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.392482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.392536 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:57.892898 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:58.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.392510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.392842 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.892427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.892690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.392771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:00.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.392852 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:00.392906 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:00.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.392554 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.392642 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.392932 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:02.392460 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:02.392937 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:02.893082 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.893161 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.893517 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.393213 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.393335 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.393710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.893298 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.893393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.893695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.392345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.392774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.892322 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.892394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:04.892799 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:05.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:05.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.892687 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.893007 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.392320 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.392389 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.392734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:06.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:07.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.392661 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.393015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:07.892841 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.892966 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.893308 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.393076 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.393143 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.393465 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.893245 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.893642 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:08.893706 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:09.392340 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:09.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.392603 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.393041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.892396 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.892674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:11.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.392788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:11.392854 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:11.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.892864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.392694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:13.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.392558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:13.392904 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:13.892358 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.392462 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.892493 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.892568 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.892916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.392728 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:15.892902 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:16.392592 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.393051 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:16.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.892766 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.392504 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.392612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.392914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.892926 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.893001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:17.893380 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:18.393186 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.393287 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.393589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:18.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.892436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.892724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.892501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:20.392577 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.392677 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:20.393034 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:20.892700 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.892780 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.893096 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.392884 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.392972 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.393246 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.893036 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.893115 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.893439 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:22.393105 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.393180 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.393531 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:22.393602 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:22.893283 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.893357 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.893654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.392360 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.392759 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.892410 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.892763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.392324 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.392671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.892367 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:24.892851 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:25.392405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.392832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:25.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.892598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:26.893059 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:27.199423 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:39:27.258871 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262674 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262814 1195787 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:27.393144 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.393338 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.393739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:27.892445 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.892515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.392686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.393001 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.892388 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:29.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:29.392759 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:29.892449 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.892575 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.892898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.892522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:31.392593 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.392698 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.393057 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:31.393175 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.892746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.393076 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.700811 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:39:32.759422 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759519 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759610 1195787 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:32.762569 1195787 out.go:179] * Enabled addons: 
	I1218 00:39:32.766452 1195787 addons.go:530] duration metric: took 1m55.398865574s for enable addons: enabled=[]
	I1218 00:39:32.892720 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.892834 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.893134 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:33.392885 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.392951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.393266 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:33.393360 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:33.893120 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.893193 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.893559 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.393379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.393480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.393839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.892868 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.392614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.892537 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:35.892942 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:36.392383 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.392802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:36.892719 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.892802 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.893187 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.393130 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.393210 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.393536 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.892374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:38.392658 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.392729 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:38.393158 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:38.892929 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.893353 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.393129 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.393197 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.393452 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.893242 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.893317 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.893673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.392517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.892369 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.892525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:40.892938 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:41.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.392798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:41.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.892875 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.392446 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.892629 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.892701 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.893027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:42.893091 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:43.392773 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.392853 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.393188 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:43.892986 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.893071 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.893334 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.393187 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.393481 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.893282 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.893350 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.893683 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:44.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:45.392327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.392700 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:45.892412 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.392830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.892694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:47.392541 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:47.392943 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:47.892913 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.892990 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.392935 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.393001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.393289 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.893086 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.893157 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.893470 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:49.393330 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.393420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.393740 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:49.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:49.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.892391 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.892665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.392473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.892447 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.392762 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.892406 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:51.892823 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:52.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:52.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.392422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.392736 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.892489 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.892612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:53.893004 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:54.392376 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:54.892366 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.892778 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:56.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:56.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:56.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.392473 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.892694 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.892772 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.892739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:58.892789 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:59.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.392532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:59.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.892633 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.392899 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.892419 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:00.892885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:01.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:01.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.892996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.392436 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.392515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.892688 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.892766 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:02.893136 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:03.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:03.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.392791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.892671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:05.392381 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:05.392855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:05.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:07.392557 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.392630 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.392948 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:07.393044 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:07.892947 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.893292 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.393116 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.393189 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.393503 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.893257 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.893330 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.893657 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:09.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.393365 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.393623 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:09.393667 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:09.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.892429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.392530 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.892422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.892749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.892544 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.892620 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:11.893010 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:12.392339 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.392715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:12.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.892814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:14.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.392813 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:14.392867 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:14.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.392448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.892562 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.892645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.892993 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:16.392714 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.392789 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.393108 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:16.393181 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:16.892981 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.893057 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.893318 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.393280 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.393362 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.393715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.392754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.892443 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.892891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:18.892951 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:19.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.392715 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.393048 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:19.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.392464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.392845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:21.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:21.392721 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:21.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.392527 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:23.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:23.392882 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:23.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.392684 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:25.392518 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.392622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.393004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:25.393058 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:25.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.892660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.392355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.892589 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.392982 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.892845 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.892928 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.893247 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:27.893296 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:28.392780 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.392854 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.393198 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:28.892980 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.893051 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.893305 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.393025 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.393097 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.393406 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.893167 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.893246 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.893569 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:29.893625 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:30.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.392419 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.392749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:30.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.892844 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.392817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.892407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:32.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.392882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:32.392936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:32.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.892506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.392470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.892473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:34.892850 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:35.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.392811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:35.892523 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.892622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.892978 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.392672 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.892754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.392604 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:37.393019 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:37.892964 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.893061 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.893356 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.393199 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.393280 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.393633 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.892784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.392710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.892802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:39.892855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:40.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:40.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.892649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.892534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:42.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.392664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:42.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.892597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.392651 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.392720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.393047 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.892468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.892851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:44.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.392461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:44.392853 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:44.892484 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.892559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.892919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.392680 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.892767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.892679 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:46.892729 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:47.392534 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.392616 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:47.893009 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.893086 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.893414 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.393173 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.393243 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.393496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.893310 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.893387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.893701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:48.893755 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:49.392365 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.392767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:49.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.892412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.392433 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:51.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.392457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:51.392745 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:51.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.392597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.392916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.892836 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.892908 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.893174 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:53.393008 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.393079 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.393392 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:53.393445 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:53.893170 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.893250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.893577 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.393256 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.393323 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.393624 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.892833 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.892909 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.893230 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.392807 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.392879 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.393195 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.892903 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.892983 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.893248 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:55.893295 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:56.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.393164 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.393494 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:56.893233 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.893303 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.893625 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.392308 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.892561 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.892640 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.893004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:58.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.392770 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:58.392830 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:58.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.392417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.392375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.392732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:00.892926 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:01.392600 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.392680 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.393027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:01.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.392459 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.392534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.392891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.892379 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:03.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.392673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:03.392717 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:03.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.892520 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.892803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:05.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:05.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:05.892501 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.892578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.892871 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.392458 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.392846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.892564 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.892651 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:07.392849 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.392927 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.393249 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:07.393300 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:07.893061 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.893141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.893407 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.393571 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.893213 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.893285 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.893586 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:09.393338 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.393409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.393737 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:09.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:09.892310 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.892384 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.892460 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.892531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.392821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.892452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:11.892845 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:12.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.392447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:12.892659 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.893041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.392855 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.892343 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:14.392354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.392431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:14.392815 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:14.892477 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.892893 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:16.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.392508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:16.392886 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:16.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.392663 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.393122 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.893034 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.893112 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.893434 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:18.393225 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.393298 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.393566 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:18.393619 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:18.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.892776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.392463 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.392545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:20.892875 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:21.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.392697 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:21.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.392426 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.892691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:23.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:23.392897 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:23.892549 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.892621 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.892930 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.392371 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:25.392487 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.392571 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.392908 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:25.392969 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:25.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.892701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.392465 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.392539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.892616 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.892694 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.892997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:27.893042 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:28.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:28.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.392438 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.892380 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:30.392322 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:30.392775 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:30.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.392484 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.392563 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.392894 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:32.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:32.392859 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:32.892645 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.892720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.893273 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.393053 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.393128 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.393383 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.893197 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.893271 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.893602 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.392329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.892409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:34.892764 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:35.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:35.892454 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.392716 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:36.892870 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:37.392455 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.392936 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:37.893011 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.893129 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.893427 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.393291 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.393628 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.893349 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.893718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:38.893795 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:39.393265 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.393337 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.393588 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:39.893331 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.893408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.893734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.892710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:41.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:41.392842 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:41.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.892453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.892740 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.893071 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:43.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.392714 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.393030 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:43.393087 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:43.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.892423 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.892741 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:45.892958 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:46.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:46.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.892695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.392510 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.392586 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.392951 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:48.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.392712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:48.392768 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:48.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.392787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.892646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:50.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:50.392909 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:50.892579 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.892652 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.892985 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.392709 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.392792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.892354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:52.892725 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:53.392414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.392831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:53.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:54.892908 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:55.392584 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.392665 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.392996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:55.892326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.892800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.392401 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.392474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.892944 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:56.892999 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:57.392855 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.392925 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:57.893155 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.893229 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.893537 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.393357 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.393431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.393713 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:59.392495 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.392587 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.392917 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:59.392973 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:59.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.893052 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:01.392764 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.392860 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:01.393227 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:01.892948 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.893270 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.393163 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.393444 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.892430 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.892742 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.892755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:03.892801 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:04.392448 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.392523 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:04.892336 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.892677 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:05.892848 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:06.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:06.892407 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.892929 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.392690 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.392765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.393085 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.893075 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.893147 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:07.893442 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:08.393253 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.393325 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.393644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:08.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.392738 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:10.392423 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:10.392894 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:10.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.392445 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.392851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.892553 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.892632 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.892973 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.392666 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:12.892876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:13.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.392815 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:13.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.392377 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:14.892953 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:15.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.392701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:15.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.392442 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.392833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.892667 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:17.392522 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.392590 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:17.392921 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:17.892652 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.892728 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.893037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.892817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.392472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.892338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:19.892782 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:20.392466 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:20.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:21.892849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:22.392532 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.392615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.392957 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:22.892669 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.892988 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.392497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.892580 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.892912 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:23.892972 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:24.392362 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.392433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.392780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:24.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.892471 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.392469 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.392543 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:26.392450 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.392522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.392876 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:26.392935 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:26.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.892686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.392919 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.392992 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.393253 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.893175 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.893249 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.893584 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.392338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.892451 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:28.892862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:29.392512 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.392870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:29.892578 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.892656 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.392314 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.893263 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.893356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.893732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:30.893797 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:31.392476 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.392560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:31.892342 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.892698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.392782 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:33.392366 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.392818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:33.392876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:33.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.892447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.392507 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.392934 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.892413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:35.392488 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:35.392932 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:35.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.892504 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.392359 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.392660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.393104 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:37.393155 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:37.892886 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.892951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.893194 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.392988 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.393067 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.393390 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.893201 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.893273 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.893589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:39.393344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.393415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.393662 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:39.393702 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:39.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.392451 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.392529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.392862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.892870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:42.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:42.892516 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.892611 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.892938 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.392638 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.392711 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.393037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.892699 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.892765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:43.893055 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:44.392565 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.392645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.392956 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:44.892656 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.892731 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.893058 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.392581 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.392846 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.393290 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.893123 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.893447 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:45.893502 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:46.393297 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.393390 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.393780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:46.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.892799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.392560 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.392631 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.392960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.892485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:48.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.392663 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:48.392703 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:48.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.392544 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.392943 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:50.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:50.392849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:50.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.392654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.892315 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.892388 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.892718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:52.392427 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:52.392889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:52.892324 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.892546 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.892874 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.392387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.892485 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.892882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:54.892940 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:55.392636 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.393066 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:55.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:57.392739 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.392818 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.393074 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:57.393125 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:57.892970 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.893044 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.893385 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.393560 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.893294 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.893360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.893621 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.892745 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:59.892790 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:00.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.392421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:00.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.892467 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.392584 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:02.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.392819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:02.392871 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:02.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.392644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.392425 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.892634 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:04.892673 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:05.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:05.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.892809 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.392658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:06.892843 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:07.392572 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.392646 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:07.892874 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.892944 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.893199 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.393002 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.393080 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.393411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.893209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.893286 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.893674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:08.893724 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:09.392325 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.392655 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:09.892370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.392431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:11.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:11.392883 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:11.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.892939 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.392370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.892624 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:13.392523 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.392903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:13.392963 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:13.892519 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.892431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.892510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.392321 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.892434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.892732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:15.892778 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:16.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:16.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.893341 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.893646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.392499 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.392579 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.892490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:17.892807 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:18.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:18.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.393074 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.393141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.393442 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.893090 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.893170 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.893422 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:19.893473 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:20.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.393284 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.393611 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:20.893284 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.893361 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:22.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:22.392844 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:22.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.892425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.892708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.392485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.392394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.392659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.892838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:24.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:25.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:25.892357 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.892457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.892775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.392472 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.892582 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.892678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.893022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:26.893073 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:27.392724 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.392791 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.393032 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:27.893016 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.893101 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.893433 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.393326 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.892772 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:29.392456 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.392869 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:29.392915 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:29.892602 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.892673 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.892440 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.392566 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.892417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:31.892716 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:32.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.392776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:32.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.892858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:33.392332 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.397972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1218 00:43:33.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:33.892918 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:34.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:34.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.892442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.892725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.392438 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.392511 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.892534 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.892662 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:35.893039 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:36.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:36.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.392739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.892912 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.892985 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.893251 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:37.893290 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:38.393068 1195787 node_ready.go:38] duration metric: took 6m0.000870722s for node "functional-288604" to be "Ready" ...
	I1218 00:43:38.396243 1195787 out.go:203] 
	W1218 00:43:38.399208 1195787 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:43:38.399223 1195787 out.go:285] * 
	W1218 00:43:38.401353 1195787 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:43:38.404386 1195787 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409510412Z" level=info msg="Using the internal default seccomp profile"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409547589Z" level=info msg="AppArmor is disabled by the system or at CRI-O build-time"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409556253Z" level=info msg="No blockio config file specified, blockio not configured"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409562595Z" level=info msg="RDT not available in the host system"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409584781Z" level=info msg="Using conmon executable: /usr/libexec/crio/conmon"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410490591Z" level=info msg="Conmon does support the --sync option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410514878Z" level=info msg="Conmon does support the --log-global-size-max option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410537728Z" level=info msg="Using conmon executable: /usr/libexec/crio/conmon"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411154759Z" level=info msg="Conmon does support the --sync option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411175239Z" level=info msg="Conmon does support the --log-global-size-max option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411325043Z" level=info msg="Updated default CNI network name to "
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.4119085Z" level=info msg="Current CRI-O configuration:\n[crio]\n  root = \"/var/lib/containers/storage\"\n  runroot = \"/run/containers/storage\"\n  imagestore = \"\"\n  storage_driver = \"overlay\"\n  log_dir = \"/var/log/crio/pods\"\n  version_file = \"/var/run/crio/version\"\n  version_file_persist = \"\"\n  clean_shutdown_file = \"/var/lib/crio/clean.shutdown\"\n  internal_wipe = true\n  internal_repair = true\n  [crio.api]\n    grpc_max_send_msg_size = 83886080\n    grpc_max_recv_msg_size = 83886080\n    listen = \"/var/run/crio/crio.sock\"\n    stream_address = \"127.0.0.1\"\n    stream_port = \"0\"\n    stream_enable_tls = false\n    stream_tls_cert = \"\"\n    stream_tls_key = \"\"\n    stream_tls_ca = \"\"\n    stream_idle_timeout = \"\"\n  [crio.runtime]\n    no_pivot = false\n    selinux = false\n    log_to_journald = false\n    drop_infra_ctr = true\n    read_only = false\n    hooks_dir = [\"/usr/share/containers/oci/
hooks.d\"]\n    default_capabilities = [\"CHOWN\", \"DAC_OVERRIDE\", \"FSETID\", \"FOWNER\", \"SETGID\", \"SETUID\", \"SETPCAP\", \"NET_BIND_SERVICE\", \"KILL\"]\n    add_inheritable_capabilities = false\n    default_sysctls = [\"net.ipv4.ip_unprivileged_port_start=0\"]\n    allowed_devices = [\"/dev/fuse\", \"/dev/net/tun\"]\n    cdi_spec_dirs = [\"/etc/cdi\", \"/var/run/cdi\"]\n    device_ownership_from_security_context = false\n    default_runtime = \"crun\"\n    decryption_keys_path = \"/etc/crio/keys/\"\n    conmon = \"\"\n    conmon_cgroup = \"pod\"\n    seccomp_profile = \"\"\n    privileged_seccomp_profile = \"\"\n    apparmor_profile = \"crio-default\"\n    blockio_config_file = \"\"\n    blockio_reload = false\n    irqbalance_config_file = \"/etc/sysconfig/irqbalance\"\n    rdt_config_file = \"\"\n    cgroup_manager = \"cgroupfs\"\n    default_mounts_file = \"\"\n    container_exits_dir = \"/var/run/crio/exits\"\n    container_attach_socket_dir = \"/var/run/crio\"\n    bind_mount_prefix = \"\"\n
uid_mappings = \"\"\n    minimum_mappable_uid = -1\n    gid_mappings = \"\"\n    minimum_mappable_gid = -1\n    log_level = \"info\"\n    log_filter = \"\"\n    namespaces_dir = \"/var/run\"\n    pinns_path = \"/usr/bin/pinns\"\n    enable_criu_support = false\n    pids_limit = -1\n    log_size_max = -1\n    ctr_stop_timeout = 30\n    separate_pull_cgroup = \"\"\n    infra_ctr_cpuset = \"\"\n    shared_cpuset = \"\"\n    enable_pod_events = false\n    irqbalance_config_restore_file = \"/etc/sysconfig/orig_irq_banned_cpus\"\n    hostnetwork_disable_selinux = true\n    disable_hostport_mapping = false\n    timezone = \"\"\n    [crio.runtime.runtimes]\n      [crio.runtime.runtimes.crun]\n        runtime_config_path = \"\"\n        runtime_path = \"/usr/libexec/crio/crun\"\n        runtime_type = \"\"\n        runtime_root = \"/run/crun\"\n        allowed_annotations = [\"io.containers.trace-syscall\"]\n        monitor_path = \"/usr/libexec/crio/conmon\"\n        monitor_cgroup = \"pod\"\n        container_min_me
mory = \"12MiB\"\n        no_sync_log = false\n      [crio.runtime.runtimes.runc]\n        runtime_config_path = \"\"\n        runtime_path = \"/usr/libexec/crio/runc\"\n        runtime_type = \"\"\n        runtime_root = \"/run/runc\"\n        monitor_path = \"/usr/libexec/crio/conmon\"\n        monitor_cgroup = \"pod\"\n        container_min_memory = \"12MiB\"\n        no_sync_log = false\n  [crio.image]\n    default_transport = \"docker://\"\n    global_auth_file = \"\"\n    namespaced_auth_dir = \"/etc/crio/auth\"\n    pause_image = \"registry.k8s.io/pause:3.10.1\"\n    pause_image_auth_file = \"\"\n    pause_command = \"/pause\"\n    signature_policy = \"/etc/crio/policy.json\"\n    signature_policy_dir = \"/etc/crio/policies\"\n    image_volumes = \"mkdir\"\n    big_files_temporary_dir = \"\"\n    auto_reload_registries = false\n    pull_progress_timeout = \"0s\"\n    oci_artifact_mount_support = true\n    short_name_mode = \"enforcing\"\n  [crio.network]\n    cni_default_network = \"\"\n    network_dir
= \"/etc/cni/net.d/\"\n    plugin_dirs = [\"/opt/cni/bin/\"]\n  [crio.metrics]\n    enable_metrics = false\n    metrics_collectors = [\"image_pulls_layer_size\", \"containers_events_dropped_total\", \"containers_oom_total\", \"processes_defunct\", \"operations_total\", \"operations_latency_seconds\", \"operations_latency_seconds_total\", \"operations_errors_total\", \"image_pulls_bytes_total\", \"image_pulls_skipped_bytes_total\", \"image_pulls_failure_total\", \"image_pulls_success_total\", \"image_layer_reuse_total\", \"containers_oom_count_total\", \"containers_seccomp_notifier_count_total\", \"resources_stalled_at_stage\", \"containers_stopped_monitor_count\"]\n    metrics_host = \"127.0.0.1\"\n    metrics_port = 9090\n    metrics_socket = \"\"\n    metrics_cert = \"\"\n    metrics_key = \"\"\n  [crio.tracing]\n    enable_tracing = false\n    tracing_endpoint = \"127.0.0.1:4317\"\n    tracing_sampling_rate_per_million = 0\n  [crio.stats]\n    stats_collection_period = 0\n    collection_period = 0\n  [cri
o.nri]\n    enable_nri = true\n    nri_listen = \"/var/run/nri/nri.sock\"\n    nri_plugin_dir = \"/opt/nri/plugins\"\n    nri_plugin_config_dir = \"/etc/nri/conf.d\"\n    nri_plugin_registration_timeout = \"5s\"\n    nri_plugin_request_timeout = \"2s\"\n    nri_disable_connections = false\n    [crio.nri.default_validator]\n      nri_enable_default_validator = false\n      nri_validator_reject_oci_hook_adjustment = false\n      nri_validator_reject_runtime_default_seccomp_adjustment = false\n      nri_validator_reject_unconfined_seccomp_adjustment = false\n      nri_validator_reject_custom_seccomp_adjustment = false\n      nri_validator_reject_namespace_adjustment = false\n      nri_validator_tolerate_missing_plugins_annotation = \"\"\n"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.412366742Z" level=info msg="Attempting to restore irqbalance config from /etc/sysconfig/orig_irq_banned_cpus"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.412420673Z" level=info msg="Restore irqbalance config: failed to get current CPU ban list, ignoring"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477189227Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477239974Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477311602Z" level=info msg="Create NRI interface"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477429572Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477452775Z" level=info msg="runtime interface created"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477466354Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.47747908Z" level=info msg="runtime interface starting up..."
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477484955Z" level=info msg="starting plugins..."
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477502275Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477588312Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:37:35 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:43:40.343283    8615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:40.345334    8615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:40.345654    8615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:40.347563    8615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:40.347921    8615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:43:40 up  7:26,  0 user,  load average: 0.34, 0.22, 0.59
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:43:37 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:38 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1138.
	Dec 18 00:43:38 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:38 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:38 functional-288604 kubelet[8507]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:38 functional-288604 kubelet[8507]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:38 functional-288604 kubelet[8507]: E1218 00:43:38.703292    8507 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:38 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:38 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:39 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1139.
	Dec 18 00:43:39 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:39 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:39 functional-288604 kubelet[8526]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:39 functional-288604 kubelet[8526]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:39 functional-288604 kubelet[8526]: E1218 00:43:39.374012    8526 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:39 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:39 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1140.
	Dec 18 00:43:40 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:40 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:40 functional-288604 kubelet[8579]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:40 functional-288604 kubelet[8579]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:40 functional-288604 kubelet[8579]: E1218 00:43:40.201725    8579 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (385.240607ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (369.03s)
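The kubelet journal above shows the same fatal validation error on every systemd restart (restart counter 1138-1140): "kubelet is configured to not run on a host using cgroup v1". A minimal sketch of confirming that root cause from a captured log file — assuming the journal excerpt was saved to a hypothetical `kubelet.log`:

```shell
# Recreate a one-line sample of the journal excerpt shown above.
cat > kubelet.log <<'EOF'
Dec 18 00:43:40 functional-288604 kubelet[8579]: E1218 00:43:40.201725    8579 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
EOF

# Extract the distinct failure reason; a crash loop prints it once per restart,
# so de-duplicating collapses it to a single line.
grep -o 'kubelet is configured to not run on a host using cgroup v1' kubelet.log | sort -u

# On the affected host itself the cgroup mode can be checked with:
#   stat -fc %T /sys/fs/cgroup    # "cgroup2fs" indicates v2, "tmpfs" indicates v1
```

This would explain both the apiserver connection refusals polled above and the `Stopped` status reported by `minikube status`: the kubelet never stays up long enough to bring the control plane online.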

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.64s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-288604 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-288604 get po -A: exit status 1 (63.624459ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-288604 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-288604 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-288604 get po -A"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (344.444717ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 logs -n 25: (1.144454849s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                           ARGS                                                                            │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/1159552.pem                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image save kicbase/echo-server:functional-240845 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:28 UTC │
	│ image          │ functional-240845 image rm kicbase/echo-server:functional-240845 --alsologtostderr                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:28 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/11595522.pem                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /usr/share/ca-certificates/11595522.pem                                                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                  │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh sudo cat /etc/test/nested/copy/1159552/hosts                                                                                        │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image save --daemon kicbase/echo-server:functional-240845 --alsologtostderr                                                             │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ update-context │ functional-240845 update-context --alsologtostderr -v=2                                                                                                   │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format json --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh            │ functional-240845 ssh pgrep buildkitd                                                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image          │ functional-240845 image ls --format yaml --alsologtostderr                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                                    │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format table --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls --format short --alsologtostderr                                                                                               │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image          │ functional-240845 image ls                                                                                                                                │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete         │ -p functional-240845                                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start          │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start          │ -p functional-288604 --alsologtostderr -v=8                                                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:37:32
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:37:32.486183 1195787 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:37:32.486610 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486624 1195787 out.go:374] Setting ErrFile to fd 2...
	I1218 00:37:32.486629 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486918 1195787 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:37:32.487313 1195787 out.go:368] Setting JSON to false
	I1218 00:37:32.488152 1195787 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26401,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:37:32.488255 1195787 start.go:143] virtualization:  
	I1218 00:37:32.491971 1195787 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:37:32.494842 1195787 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:37:32.494944 1195787 notify.go:221] Checking for updates...
	I1218 00:37:32.500434 1195787 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:37:32.503311 1195787 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:32.506071 1195787 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:37:32.508979 1195787 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:37:32.511873 1195787 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:37:32.515326 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:32.515476 1195787 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:37:32.549560 1195787 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:37:32.549709 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.608968 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.600331572 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.609068 1195787 docker.go:319] overlay module found
	I1218 00:37:32.612053 1195787 out.go:179] * Using the docker driver based on existing profile
	I1218 00:37:32.614859 1195787 start.go:309] selected driver: docker
	I1218 00:37:32.614879 1195787 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.614985 1195787 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:37:32.615081 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.681718 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.67244891 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.682130 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:32.682189 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:32.682255 1195787 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.687138 1195787 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:37:32.690134 1195787 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:37:32.693078 1195787 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:37:32.696069 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:32.696123 1195787 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:37:32.696143 1195787 cache.go:65] Caching tarball of preloaded images
	I1218 00:37:32.696183 1195787 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:37:32.696303 1195787 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:37:32.696317 1195787 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:37:32.696417 1195787 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:37:32.714975 1195787 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:37:32.714995 1195787 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:37:32.715013 1195787 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:37:32.715043 1195787 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:37:32.715099 1195787 start.go:364] duration metric: took 33.796µs to acquireMachinesLock for "functional-288604"
	I1218 00:37:32.715121 1195787 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:37:32.715131 1195787 fix.go:54] fixHost starting: 
	I1218 00:37:32.715395 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:32.731575 1195787 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:37:32.731606 1195787 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:37:32.734910 1195787 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:37:32.734955 1195787 machine.go:94] provisionDockerMachine start ...
	I1218 00:37:32.735034 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.751418 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.751747 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.751760 1195787 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:37:32.904326 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:32.904350 1195787 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:37:32.904413 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.933199 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.933525 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.933536 1195787 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:37:33.096692 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:33.096816 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.115124 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.115445 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.115466 1195787 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:37:33.272592 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:37:33.272617 1195787 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:37:33.272637 1195787 ubuntu.go:190] setting up certificates
	I1218 00:37:33.272647 1195787 provision.go:84] configureAuth start
	I1218 00:37:33.272712 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:33.291737 1195787 provision.go:143] copyHostCerts
	I1218 00:37:33.291803 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291863 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:37:33.291880 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291977 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:37:33.292105 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292127 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:37:33.292137 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292177 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:37:33.292274 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292300 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:37:33.292315 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292347 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:37:33.292433 1195787 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:37:33.397529 1195787 provision.go:177] copyRemoteCerts
	I1218 00:37:33.397646 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:37:33.397692 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.416603 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:33.523879 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:37:33.523950 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:37:33.540143 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:37:33.540204 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:37:33.557091 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:37:33.557194 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:37:33.573937 1195787 provision.go:87] duration metric: took 301.27685ms to configureAuth
	I1218 00:37:33.573963 1195787 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:37:33.574138 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:33.574247 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.591351 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.591663 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.591676 1195787 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:37:33.932454 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:37:33.932478 1195787 machine.go:97] duration metric: took 1.197515142s to provisionDockerMachine
	I1218 00:37:33.932490 1195787 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:37:33.932503 1195787 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:37:33.932581 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:37:33.932636 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.953296 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.060199 1195787 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:37:34.063627 1195787 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:37:34.063655 1195787 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:37:34.063660 1195787 command_runner.go:130] > VERSION_ID="12"
	I1218 00:37:34.063664 1195787 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:37:34.063680 1195787 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:37:34.063684 1195787 command_runner.go:130] > ID=debian
	I1218 00:37:34.063689 1195787 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:37:34.063694 1195787 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:37:34.063700 1195787 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:37:34.063783 1195787 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:37:34.063800 1195787 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:37:34.063810 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:37:34.063871 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:37:34.063955 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:37:34.063966 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:37:34.064048 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:37:34.064056 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:37:34.064100 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:37:34.071756 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:34.089207 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:37:34.106978 1195787 start.go:296] duration metric: took 174.472072ms for postStartSetup
	I1218 00:37:34.107054 1195787 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:37:34.107096 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.124265 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.224786 1195787 command_runner.go:130] > 12%
	I1218 00:37:34.224858 1195787 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:37:34.228879 1195787 command_runner.go:130] > 171G
	I1218 00:37:34.229324 1195787 fix.go:56] duration metric: took 1.514188493s for fixHost
	I1218 00:37:34.229353 1195787 start.go:83] releasing machines lock for "functional-288604", held for 1.514233177s
	I1218 00:37:34.229425 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:34.246154 1195787 ssh_runner.go:195] Run: cat /version.json
	I1218 00:37:34.246206 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.246451 1195787 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:37:34.246509 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.266363 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.276260 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.371623 1195787 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:37:34.371754 1195787 ssh_runner.go:195] Run: systemctl --version
	I1218 00:37:34.461010 1195787 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:37:34.461057 1195787 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:37:34.461077 1195787 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:37:34.461152 1195787 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:37:34.497659 1195787 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:37:34.501645 1195787 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:37:34.502005 1195787 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:37:34.502070 1195787 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:37:34.509755 1195787 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:37:34.509780 1195787 start.go:496] detecting cgroup driver to use...
	I1218 00:37:34.509811 1195787 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:37:34.509875 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:37:34.523916 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:37:34.536646 1195787 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:37:34.536736 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:37:34.551504 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:37:34.564054 1195787 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:37:34.675890 1195787 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:37:34.798642 1195787 docker.go:234] disabling docker service ...
	I1218 00:37:34.798703 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:37:34.813006 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:37:34.825087 1195787 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:37:34.942798 1195787 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:37:35.067868 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:37:35.088600 1195787 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:37:35.102366 1195787 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:37:35.103752 1195787 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:37:35.103819 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.113147 1195787 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:37:35.113241 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.122530 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.131393 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.140799 1195787 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:37:35.148737 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.157396 1195787 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.165643 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.174650 1195787 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:37:35.181215 1195787 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:37:35.182122 1195787 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:37:35.189136 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.306446 1195787 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:37:35.483449 1195787 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:37:35.483550 1195787 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:37:35.487145 1195787 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:37:35.487172 1195787 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:37:35.487179 1195787 command_runner.go:130] > Device: 0,72	Inode: 1642        Links: 1
	I1218 00:37:35.487186 1195787 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:35.487202 1195787 command_runner.go:130] > Access: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487220 1195787 command_runner.go:130] > Modify: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487225 1195787 command_runner.go:130] > Change: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487229 1195787 command_runner.go:130] >  Birth: -
	I1218 00:37:35.487254 1195787 start.go:564] Will wait 60s for crictl version
	I1218 00:37:35.487306 1195787 ssh_runner.go:195] Run: which crictl
	I1218 00:37:35.490344 1195787 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:37:35.490683 1195787 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:37:35.512944 1195787 command_runner.go:130] > Version:  0.1.0
	I1218 00:37:35.513232 1195787 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:37:35.513363 1195787 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:37:35.513391 1195787 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:37:35.515559 1195787 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:37:35.515677 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.541522 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.541589 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.541609 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.541630 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.541651 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.541672 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.541692 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.541720 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.541741 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.541768 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.541786 1195787 command_runner.go:130] >      static
	I1218 00:37:35.541805 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.541829 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.541856 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.541889 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.541915 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.541933 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.541952 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.541983 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.541999 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.543191 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.569029 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.569102 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.569122 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.569144 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.569164 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.569191 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.569210 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.569239 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.569267 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.569285 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.569302 1195787 command_runner.go:130] >      static
	I1218 00:37:35.569320 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.569347 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.569366 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.569384 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.569405 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.569429 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.569449 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.569467 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.569485 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.575974 1195787 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:37:35.578737 1195787 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:37:35.594362 1195787 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:37:35.598161 1195787 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:37:35.598363 1195787 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:37:35.598485 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:35.598543 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.635547 1195787 command_runner.go:130] > {
	I1218 00:37:35.635578 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.635584 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635591 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.635596 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635602 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.635605 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635609 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635623 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.635631 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.635634 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635639 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.635643 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635650 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635654 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635657 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635668 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.635672 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635677 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.635680 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635684 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635693 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.635701 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.635704 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635709 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.635712 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635719 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635723 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635731 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.635735 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635740 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.635743 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635747 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635758 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.635773 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.635777 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635781 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.635786 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.635790 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635793 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635795 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635802 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.635805 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635810 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.635815 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635823 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635830 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.635838 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.635841 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635845 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.635848 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635852 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635855 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635864 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635868 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635872 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635875 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635881 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.635885 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635890 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.635893 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635897 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635905 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.635912 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.635915 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635920 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.635926 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635930 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635934 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635938 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635941 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635944 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635947 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635954 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.635957 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635963 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.635966 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635970 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635978 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.635986 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.635989 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635993 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.635997 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636000 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636003 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636007 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636011 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636013 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636016 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636022 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.636027 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636032 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.636035 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636039 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636046 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.636054 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.636057 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636060 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.636064 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636073 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636077 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636079 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636086 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.636090 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636095 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.636098 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636102 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636110 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.636125 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.636133 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636137 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.636140 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636144 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636147 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636151 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636154 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636158 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636160 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636166 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.636170 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636175 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.636178 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636182 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636190 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.636197 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.636200 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636204 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.636208 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636211 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.636214 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636238 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636243 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.636251 1195787 command_runner.go:130] >     }
	I1218 00:37:35.636254 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.636256 1195787 command_runner.go:130] > }
	I1218 00:37:35.636431 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.636439 1195787 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:37:35.636495 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.658094 1195787 command_runner.go:130] > {
	I1218 00:37:35.658111 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.658115 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658124 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.658128 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658134 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.658137 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658141 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658151 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.658159 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.658163 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658167 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.658171 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658176 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658179 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658182 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658189 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.658192 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658198 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.658201 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658205 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658213 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.658222 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.658225 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658229 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.658233 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658242 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658250 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658262 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658269 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.658273 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658279 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.658282 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658286 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658294 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.658302 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.658305 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658309 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.658313 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.658317 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658321 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658323 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658330 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.658334 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658339 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.658344 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658348 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658356 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.658367 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.658370 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658374 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.658378 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658382 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658384 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658393 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658397 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658400 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658403 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658410 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.658413 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658425 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.658431 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658435 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658443 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.658455 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.658465 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658469 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.658472 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658476 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658479 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658483 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658487 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658490 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658493 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658499 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.658503 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658508 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.658511 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658515 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658523 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.658532 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.658535 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658539 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.658543 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658549 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658552 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658556 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658560 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658563 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658566 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658572 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.658577 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658582 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.658589 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658598 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658605 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.658613 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.658616 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658620 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.658624 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658628 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658631 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658642 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658650 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.658653 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658659 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.658662 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658666 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658677 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.658694 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.658697 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658701 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.658705 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658708 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658711 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658715 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658718 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658721 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658731 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.658734 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658739 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.658742 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658746 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658754 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.658761 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.658772 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658777 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.658781 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658784 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.658788 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658791 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658794 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.658798 1195787 command_runner.go:130] >     }
	I1218 00:37:35.658800 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.658803 1195787 command_runner.go:130] > }
	I1218 00:37:35.660205 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.660262 1195787 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:37:35.660279 1195787 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:37:35.660385 1195787 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:37:35.660470 1195787 ssh_runner.go:195] Run: crio config
	I1218 00:37:35.707278 1195787 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:37:35.707300 1195787 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:37:35.707307 1195787 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:37:35.707310 1195787 command_runner.go:130] > #
	I1218 00:37:35.707318 1195787 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:37:35.707324 1195787 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:37:35.707330 1195787 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:37:35.707346 1195787 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:37:35.707349 1195787 command_runner.go:130] > # reload'.
	I1218 00:37:35.707356 1195787 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:37:35.707362 1195787 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:37:35.707368 1195787 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:37:35.707383 1195787 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:37:35.707387 1195787 command_runner.go:130] > [crio]
	I1218 00:37:35.707393 1195787 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:37:35.707398 1195787 command_runner.go:130] > # containers images, in this directory.
	I1218 00:37:35.707595 1195787 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:37:35.707607 1195787 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:37:35.707620 1195787 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:37:35.707627 1195787 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:37:35.707631 1195787 command_runner.go:130] > # imagestore = ""
	I1218 00:37:35.707637 1195787 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:37:35.707643 1195787 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:37:35.707768 1195787 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:37:35.707777 1195787 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:37:35.707784 1195787 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:37:35.707788 1195787 command_runner.go:130] > # storage_option = [
	I1218 00:37:35.707935 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.707952 1195787 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:37:35.707959 1195787 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:37:35.707971 1195787 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:37:35.707978 1195787 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:37:35.707984 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:37:35.707990 1195787 command_runner.go:130] > # always happen on a node reboot
	I1218 00:37:35.708138 1195787 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:37:35.708160 1195787 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:37:35.708174 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:37:35.708183 1195787 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:37:35.708342 1195787 command_runner.go:130] > # version_file_persist = ""
	I1218 00:37:35.708354 1195787 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:37:35.708363 1195787 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:37:35.708367 1195787 command_runner.go:130] > # internal_wipe = true
	I1218 00:37:35.708381 1195787 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:37:35.708388 1195787 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:37:35.708503 1195787 command_runner.go:130] > # internal_repair = true
	I1218 00:37:35.708512 1195787 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:37:35.708519 1195787 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:37:35.708525 1195787 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:37:35.708671 1195787 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:37:35.708682 1195787 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:37:35.708686 1195787 command_runner.go:130] > [crio.api]
	I1218 00:37:35.708706 1195787 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:37:35.708833 1195787 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:37:35.708843 1195787 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:37:35.708997 1195787 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:37:35.709007 1195787 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:37:35.709013 1195787 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:37:35.709016 1195787 command_runner.go:130] > # stream_port = "0"
	I1218 00:37:35.709022 1195787 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:37:35.709150 1195787 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:37:35.709160 1195787 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:37:35.709282 1195787 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:37:35.709292 1195787 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:37:35.709298 1195787 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709420 1195787 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:37:35.709430 1195787 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:37:35.709436 1195787 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709440 1195787 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:37:35.709447 1195787 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:37:35.709453 1195787 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:37:35.709462 1195787 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:37:35.709593 1195787 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:37:35.709614 1195787 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709735 1195787 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:37:35.709746 1195787 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709864 1195787 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:37:35.709875 1195787 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:37:35.709881 1195787 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:37:35.709885 1195787 command_runner.go:130] > [crio.runtime]
	I1218 00:37:35.709891 1195787 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:37:35.709896 1195787 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:37:35.709907 1195787 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:37:35.709913 1195787 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:37:35.709917 1195787 command_runner.go:130] > # default_ulimits = [
	I1218 00:37:35.710017 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710026 1195787 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:37:35.710154 1195787 command_runner.go:130] > # no_pivot = false
	I1218 00:37:35.710163 1195787 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:37:35.710170 1195787 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:37:35.710300 1195787 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:37:35.710309 1195787 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:37:35.710323 1195787 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:37:35.710336 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710476 1195787 command_runner.go:130] > # conmon = ""
	I1218 00:37:35.710485 1195787 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:37:35.710492 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:37:35.710496 1195787 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:37:35.710508 1195787 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:37:35.710514 1195787 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:37:35.710521 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710524 1195787 command_runner.go:130] > # conmon_env = [
	I1218 00:37:35.710624 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710633 1195787 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:37:35.710639 1195787 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:37:35.710644 1195787 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:37:35.710648 1195787 command_runner.go:130] > # default_env = [
	I1218 00:37:35.710790 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710800 1195787 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:37:35.710816 1195787 command_runner.go:130] > # This option is deprecated, and will be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:37:35.710953 1195787 command_runner.go:130] > # selinux = false
	I1218 00:37:35.710964 1195787 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:37:35.710972 1195787 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:37:35.710977 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.710981 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.710993 1195787 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:37:35.710999 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711131 1195787 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:37:35.711142 1195787 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:37:35.711149 1195787 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:37:35.711162 1195787 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:37:35.711169 1195787 command_runner.go:130] > # the profile is set to "unconfined", then this equates to disabling AppArmor.
	I1218 00:37:35.711174 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711345 1195787 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:37:35.711373 1195787 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:37:35.711401 1195787 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:37:35.711419 1195787 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:37:35.711456 1195787 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:37:35.711477 1195787 command_runner.go:130] > # blockio parameters.
	I1218 00:37:35.711667 1195787 command_runner.go:130] > # blockio_reload = false
	I1218 00:37:35.711706 1195787 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:37:35.711725 1195787 command_runner.go:130] > # irqbalance daemon.
	I1218 00:37:35.711743 1195787 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:37:35.711776 1195787 command_runner.go:130] > # irqbalance_config_restore_file allows setting a cpu mask CRI-O should
	I1218 00:37:35.711801 1195787 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:37:35.711821 1195787 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:37:35.711855 1195787 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:37:35.711879 1195787 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:37:35.711898 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.712052 1195787 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:37:35.712092 1195787 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:37:35.712112 1195787 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:37:35.712133 1195787 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:37:35.712151 1195787 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:37:35.712187 1195787 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:37:35.712206 1195787 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:37:35.712253 1195787 command_runner.go:130] > # will be added.
	I1218 00:37:35.712276 1195787 command_runner.go:130] > # default_capabilities = [
	I1218 00:37:35.712420 1195787 command_runner.go:130] > # 	"CHOWN",
	I1218 00:37:35.712461 1195787 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:37:35.712541 1195787 command_runner.go:130] > # 	"FSETID",
	I1218 00:37:35.712631 1195787 command_runner.go:130] > # 	"FOWNER",
	I1218 00:37:35.712660 1195787 command_runner.go:130] > # 	"SETGID",
	I1218 00:37:35.712794 1195787 command_runner.go:130] > # 	"SETUID",
	I1218 00:37:35.712896 1195787 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:37:35.712994 1195787 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:37:35.713065 1195787 command_runner.go:130] > # 	"KILL",
	I1218 00:37:35.713149 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.713172 1195787 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:37:35.713258 1195787 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:37:35.713410 1195787 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:37:35.713489 1195787 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:37:35.713545 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.713716 1195787 command_runner.go:130] > default_sysctls = [
	I1218 00:37:35.713734 1195787 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:37:35.713949 1195787 command_runner.go:130] > ]
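For reference, the explicitly set (non-commented) values echoed by the log up to this point amount to a CRI-O drop-in like the following sketch. The values are taken from the log lines above; the drop-in file path is an assumption, not something this host reports.

```toml
# Sketch of the non-default settings this log shows as active.
# A path such as /etc/crio/crio.conf.d/02-crio.conf is assumed.
[crio.runtime]
conmon_cgroup = "pod"
cgroup_manager = "cgroupfs"
default_sysctls = [
	"net.ipv4.ip_unprivileged_port_start=0",
]
```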
	I1218 00:37:35.713959 1195787 command_runner.go:130] > # List of devices on the host that a
	I1218 00:37:35.713966 1195787 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:37:35.713970 1195787 command_runner.go:130] > # allowed_devices = [
	I1218 00:37:35.713995 1195787 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:37:35.714000 1195787 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:37:35.714003 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714008 1195787 command_runner.go:130] > # List of additional devices, specified as
	I1218 00:37:35.714016 1195787 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:37:35.714022 1195787 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:37:35.714028 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.714032 1195787 command_runner.go:130] > # additional_devices = [
	I1218 00:37:35.714035 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714040 1195787 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:37:35.714044 1195787 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:37:35.714048 1195787 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:37:35.714052 1195787 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:37:35.714056 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714062 1195787 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:37:35.714068 1195787 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:37:35.714077 1195787 command_runner.go:130] > # Defaults to false.
	I1218 00:37:35.714083 1195787 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:37:35.714089 1195787 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:37:35.714100 1195787 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:37:35.714537 1195787 command_runner.go:130] > # hooks_dir = [
	I1218 00:37:35.714675 1195787 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:37:35.714791 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.715258 1195787 command_runner.go:130] > # Path to the file specifying the default mounts for each container. The
	I1218 00:37:35.715414 1195787 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:37:35.715601 1195787 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:37:35.715650 1195787 command_runner.go:130] > #
	I1218 00:37:35.715843 1195787 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:37:35.715943 1195787 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:37:35.716060 1195787 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:37:35.716083 1195787 command_runner.go:130] > #
	I1218 00:37:35.716111 1195787 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:37:35.716131 1195787 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:37:35.716166 1195787 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:37:35.716187 1195787 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:37:35.716204 1195787 command_runner.go:130] > #
	I1218 00:37:35.716248 1195787 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:37:35.716275 1195787 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:37:35.716306 1195787 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:37:35.717368 1195787 command_runner.go:130] > # pids_limit = -1
	I1218 00:37:35.717418 1195787 command_runner.go:130] > # Maximum size allowed for the container log file. Negative numbers indicate
	I1218 00:37:35.717442 1195787 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:37:35.717463 1195787 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:37:35.717499 1195787 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:37:35.717521 1195787 command_runner.go:130] > # log_size_max = -1
	I1218 00:37:35.717693 1195787 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:37:35.717720 1195787 command_runner.go:130] > # log_to_journald = false
	I1218 00:37:35.717752 1195787 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:37:35.717776 1195787 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:37:35.717810 1195787 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:37:35.717835 1195787 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:37:35.717855 1195787 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:37:35.717888 1195787 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:37:35.717911 1195787 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:37:35.717929 1195787 command_runner.go:130] > # read_only = false
	I1218 00:37:35.717949 1195787 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:37:35.717978 1195787 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:37:35.717999 1195787 command_runner.go:130] > # live configuration reload.
	I1218 00:37:35.718017 1195787 command_runner.go:130] > # log_level = "info"
	I1218 00:37:35.718039 1195787 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:37:35.718073 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.718091 1195787 command_runner.go:130] > # log_filter = ""
	I1218 00:37:35.718112 1195787 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718144 1195787 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:37:35.718167 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718189 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718221 1195787 command_runner.go:130] > # uid_mappings = ""
	I1218 00:37:35.718243 1195787 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718262 1195787 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:37:35.718280 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718311 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718343 1195787 command_runner.go:130] > # gid_mappings = ""
	I1218 00:37:35.718363 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:37:35.718395 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718420 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718442 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718481 1195787 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:37:35.718507 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:37:35.718529 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718561 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718589 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718607 1195787 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:37:35.718641 1195787 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:37:35.718665 1195787 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:37:35.718685 1195787 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:37:35.718717 1195787 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:37:35.718741 1195787 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:37:35.718762 1195787 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:37:35.718793 1195787 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:37:35.718814 1195787 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:37:35.718831 1195787 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:37:35.718851 1195787 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:37:35.718882 1195787 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:37:35.718907 1195787 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:37:35.718931 1195787 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:37:35.718965 1195787 command_runner.go:130] > # shared_cpuset determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:37:35.718989 1195787 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:37:35.719009 1195787 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:37:35.719039 1195787 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:37:35.719315 1195787 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:37:35.719348 1195787 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:37:35.719365 1195787 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:37:35.719396 1195787 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:37:35.719423 1195787 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:37:35.719450 1195787 command_runner.go:130] > # pinns_path = ""
	I1218 00:37:35.719484 1195787 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:37:35.719510 1195787 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:37:35.719528 1195787 command_runner.go:130] > # enable_criu_support = true
	I1218 00:37:35.719563 1195787 command_runner.go:130] > # Enable/disable the generation of the container and
	I1218 00:37:35.719586 1195787 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:37:35.719602 1195787 command_runner.go:130] > # enable_pod_events = false
	I1218 00:37:35.719622 1195787 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:37:35.719651 1195787 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:37:35.719672 1195787 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:37:35.719690 1195787 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:37:35.719711 1195787 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior of being created as a directory).
	I1218 00:37:35.719747 1195787 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:37:35.719770 1195787 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:37:35.719795 1195787 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:37:35.719826 1195787 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:37:35.719849 1195787 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:37:35.719865 1195787 command_runner.go:130] > # ]
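Applying the /etc/hostname example given in the comment above, a populated version of this option would look like the following sketch. This is illustrative only; the log shows the list empty on this host.

```toml
# Illustrative use of absent_mount_sources_to_reject with the
# /etc/hostname example from the config comments; not set on this host.
[crio.runtime]
absent_mount_sources_to_reject = [
	"/etc/hostname",
]
```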
	I1218 00:37:35.719885 1195787 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:37:35.719916 1195787 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:37:35.719938 1195787 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:37:35.719957 1195787 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:37:35.719973 1195787 command_runner.go:130] > #
	I1218 00:37:35.720002 1195787 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:37:35.720024 1195787 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:37:35.720041 1195787 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:37:35.720059 1195787 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:37:35.720096 1195787 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:37:35.720121 1195787 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:37:35.720139 1195787 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:37:35.720170 1195787 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:37:35.720190 1195787 command_runner.go:130] > # monitor_env = []
	I1218 00:37:35.720207 1195787 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:37:35.720256 1195787 command_runner.go:130] > # allowed_annotations = []
	I1218 00:37:35.720274 1195787 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:37:35.720279 1195787 command_runner.go:130] > # no_sync_log = false
	I1218 00:37:35.720284 1195787 command_runner.go:130] > # default_annotations = {}
	I1218 00:37:35.720288 1195787 command_runner.go:130] > # stream_websockets = false
	I1218 00:37:35.720292 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.720348 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.720360 1195787 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:37:35.720367 1195787 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:37:35.720386 1195787 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:37:35.720399 1195787 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:37:35.720403 1195787 command_runner.go:130] > #   in $PATH.
	I1218 00:37:35.720418 1195787 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:37:35.720433 1195787 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:37:35.720439 1195787 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:37:35.720444 1195787 command_runner.go:130] > #   state.
	I1218 00:37:35.720451 1195787 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:37:35.720460 1195787 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:37:35.720466 1195787 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:37:35.720473 1195787 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:37:35.720480 1195787 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:37:35.720496 1195787 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:37:35.720506 1195787 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:37:35.720513 1195787 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:37:35.720531 1195787 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:37:35.720543 1195787 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:37:35.720550 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:37:35.720566 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:37:35.720576 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:37:35.720582 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:37:35.720590 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:37:35.720628 1195787 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:37:35.720649 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:37:35.720665 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:37:35.720671 1195787 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:37:35.720679 1195787 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:37:35.720689 1195787 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:37:35.720706 1195787 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:37:35.720719 1195787 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:37:35.720733 1195787 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:37:35.720746 1195787 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:37:35.720754 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:37:35.720760 1195787 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:37:35.720764 1195787 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:37:35.720772 1195787 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:37:35.720777 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:37:35.720783 1195787 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:37:35.720795 1195787 command_runner.go:130] > #   should be moved to the container's cgroup.
	I1218 00:37:35.720813 1195787 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:37:35.720825 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:37:35.720833 1195787 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:37:35.720849 1195787 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:37:35.720857 1195787 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:37:35.720865 1195787 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:37:35.720875 1195787 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:37:35.720882 1195787 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:37:35.720888 1195787 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:37:35.720897 1195787 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:37:35.720905 1195787 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:37:35.720932 1195787 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:37:35.720946 1195787 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:37:35.720960 1195787 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:37:35.720968 1195787 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:37:35.720975 1195787 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:37:35.720983 1195787 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:37:35.720995 1195787 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:37:35.721013 1195787 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:37:35.721023 1195787 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:37:35.721031 1195787 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:37:35.721033 1195787 command_runner.go:130] > #
	I1218 00:37:35.721038 1195787 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:37:35.721043 1195787 command_runner.go:130] > #
	I1218 00:37:35.721049 1195787 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:37:35.721058 1195787 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:37:35.721061 1195787 command_runner.go:130] > #
	I1218 00:37:35.721072 1195787 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:37:35.721082 1195787 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:37:35.721085 1195787 command_runner.go:130] > #
	I1218 00:37:35.721091 1195787 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:37:35.721097 1195787 command_runner.go:130] > # feature.
	I1218 00:37:35.721100 1195787 command_runner.go:130] > #
	I1218 00:37:35.721106 1195787 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:37:35.721112 1195787 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:37:35.721119 1195787 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:37:35.721125 1195787 command_runner.go:130] > # a blocked syscall and will terminate the workload after a timeout of 5
	I1218 00:37:35.721131 1195787 command_runner.go:130] > # seconds if "io.kubernetes.cri-o.seccompNotifierAction" is set to "stop".
	I1218 00:37:35.721141 1195787 command_runner.go:130] > #
	I1218 00:37:35.721147 1195787 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:37:35.721153 1195787 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:37:35.721158 1195787 command_runner.go:130] > #
	I1218 00:37:35.721164 1195787 command_runner.go:130] > # This also means that the Pod's "restartPolicy" has to be set to "Never",
	I1218 00:37:35.721170 1195787 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:37:35.721177 1195787 command_runner.go:130] > #
	I1218 00:37:35.721183 1195787 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:37:35.721188 1195787 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:37:35.721192 1195787 command_runner.go:130] > # limitation.
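	Taken together, the notifier comments above describe a pod-side setup roughly like the following (a minimal sketch; the pod name and image are illustrative, and the runtime handler must list the annotation in its allowed_annotations):

	```yaml
	# Sketch of a pod using the seccomp notifier described above.
	# Requires runc >= 1.1.0 or crun >= 0.19, and a CRI-O runtime whose
	# allowed_annotations includes "io.kubernetes.cri-o.seccompNotifierAction".
	apiVersion: v1
	kind: Pod
	metadata:
	  name: seccomp-notifier-demo        # illustrative name
	  annotations:
	    # "stop" terminates the workload ~5s after a blocked syscall is observed
	    io.kubernetes.cri-o.seccompNotifierAction: "stop"
	spec:
	  restartPolicy: Never               # required, or the kubelet restarts the container
	  containers:
	    - name: demo
	      image: registry.k8s.io/pause:3.10.1
	      securityContext:
	        seccompProfile:
	          type: RuntimeDefault       # a profile CRI-O can modify (not Unconfined)
	```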
	I1218 00:37:35.721196 1195787 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:37:35.721200 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:37:35.721204 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721215 1195787 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:37:35.721228 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721232 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721236 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721241 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721248 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721251 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721255 1195787 command_runner.go:130] > allowed_annotations = [
	I1218 00:37:35.721261 1195787 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:37:35.721265 1195787 command_runner.go:130] > ]
	I1218 00:37:35.721270 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721274 1195787 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:37:35.721279 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:37:35.721282 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721289 1195787 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:37:35.721293 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721307 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721312 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721316 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721320 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721325 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721331 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721339 1195787 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:37:35.721347 1195787 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:37:35.721353 1195787 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:37:35.721361 1195787 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:37:35.721384 1195787 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:37:35.721399 1195787 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores; it is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:37:35.721406 1195787 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:37:35.721417 1195787 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:37:35.721427 1195787 command_runner.go:130] > # For a container to opt into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:37:35.721438 1195787 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:37:35.721444 1195787 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:37:35.721457 1195787 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:37:35.721461 1195787 command_runner.go:130] > # Example:
	I1218 00:37:35.721466 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:37:35.721472 1195787 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:37:35.721477 1195787 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:37:35.721487 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:37:35.721490 1195787 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:37:35.721494 1195787 command_runner.go:130] > # cpushares = "5"
	I1218 00:37:35.721498 1195787 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:37:35.721502 1195787 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:37:35.721507 1195787 command_runner.go:130] > # cpulimit = "35"
	I1218 00:37:35.721510 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.721516 1195787 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:37:35.721524 1195787 command_runner.go:130] > # To opt in, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:37:35.721529 1195787 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:37:35.721535 1195787 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:37:35.721544 1195787 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:37:35.721552 1195787 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
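	Following the commented example above, a pod opting into the "workload-type" workload and overriding one resource would look roughly like this (a sketch only; pod and container names are illustrative, and the per-container annotation uses the $annotation_prefix.$resource/$ctrName form described above):

	```yaml
	# Sketch of a pod using the EXPERIMENTAL workloads table above.
	apiVersion: v1
	kind: Pod
	metadata:
	  name: workload-demo                       # illustrative name
	  annotations:
	    io.crio/workload: ""                    # activation annotation: key only, value ignored
	    # per-container override: $annotation_prefix.$resource/$ctrName = "value"
	    io.crio.workload-type.cpushares/app: "10"
	spec:
	  containers:
	    - name: app
	      image: registry.k8s.io/pause:3.10.1
	```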
	I1218 00:37:35.721556 1195787 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:37:35.721563 1195787 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:37:35.721568 1195787 command_runner.go:130] > # Default value is set to true
	I1218 00:37:35.721574 1195787 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:37:35.721580 1195787 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:37:35.721588 1195787 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:37:35.721592 1195787 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:37:35.721621 1195787 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:37:35.721627 1195787 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:37:35.721635 1195787 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:37:35.721640 1195787 command_runner.go:130] > # timezone = ""
	I1218 00:37:35.721647 1195787 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:37:35.721650 1195787 command_runner.go:130] > #
	I1218 00:37:35.721656 1195787 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:37:35.721665 1195787 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:37:35.721672 1195787 command_runner.go:130] > [crio.image]
	I1218 00:37:35.721679 1195787 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:37:35.721683 1195787 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:37:35.721689 1195787 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:37:35.721701 1195787 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721706 1195787 command_runner.go:130] > # global_auth_file = ""
	I1218 00:37:35.721711 1195787 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:37:35.721723 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721728 1195787 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.721738 1195787 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:37:35.721745 1195787 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721754 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721758 1195787 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:37:35.721764 1195787 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:37:35.721769 1195787 command_runner.go:130] > # When explicitly set to "", it will fall back to the entrypoint and command
	I1218 00:37:35.721776 1195787 command_runner.go:130] > # specified in the pause image. When commented out, it will fall back to the
	I1218 00:37:35.721781 1195787 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:37:35.721787 1195787 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:37:35.721793 1195787 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:37:35.721799 1195787 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:37:35.721805 1195787 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:37:35.721813 1195787 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:37:35.721819 1195787 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:37:35.721825 1195787 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:37:35.721831 1195787 command_runner.go:130] > # pinned_images = [
	I1218 00:37:35.721834 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721840 1195787 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:37:35.721846 1195787 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:37:35.721853 1195787 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:37:35.721859 1195787 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:37:35.721866 1195787 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:37:35.721871 1195787 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:37:35.721879 1195787 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:37:35.721892 1195787 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:37:35.721901 1195787 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:37:35.721912 1195787 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or the
	I1218 00:37:35.721918 1195787 command_runner.go:130] > # system-wide policy will be used as fallback. Must be an absolute path.
	I1218 00:37:35.721923 1195787 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:37:35.721928 1195787 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:37:35.721935 1195787 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:37:35.721938 1195787 command_runner.go:130] > # changing them here.
	I1218 00:37:35.721944 1195787 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:37:35.721955 1195787 command_runner.go:130] > # insecure_registries = [
	I1218 00:37:35.721957 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721964 1195787 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:37:35.721969 1195787 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:37:35.721977 1195787 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:37:35.721983 1195787 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:37:35.721987 1195787 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:37:35.721998 1195787 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:37:35.722005 1195787 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:37:35.722009 1195787 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:37:35.722015 1195787 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:37:35.722024 1195787 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:37:35.722031 1195787 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:37:35.722036 1195787 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:37:35.722048 1195787 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:37:35.722054 1195787 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:37:35.722062 1195787 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:37:35.722070 1195787 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:37:35.722074 1195787 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:37:35.722081 1195787 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:37:35.722087 1195787 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:37:35.722091 1195787 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:37:35.722097 1195787 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:37:35.722108 1195787 command_runner.go:130] > # CNI plugins.
	I1218 00:37:35.722117 1195787 command_runner.go:130] > [crio.network]
	I1218 00:37:35.722131 1195787 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:37:35.722136 1195787 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:37:35.722142 1195787 command_runner.go:130] > # cni_default_network = ""
	I1218 00:37:35.722148 1195787 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:37:35.722156 1195787 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:37:35.722162 1195787 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:37:35.722165 1195787 command_runner.go:130] > # plugin_dirs = [
	I1218 00:37:35.722169 1195787 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:37:35.722172 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722176 1195787 command_runner.go:130] > # List of included pod metrics.
	I1218 00:37:35.722180 1195787 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:37:35.722182 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722190 1195787 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:37:35.722196 1195787 command_runner.go:130] > [crio.metrics]
	I1218 00:37:35.722201 1195787 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:37:35.722205 1195787 command_runner.go:130] > # enable_metrics = false
	I1218 00:37:35.722209 1195787 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:37:35.722215 1195787 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:37:35.722222 1195787 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:37:35.722233 1195787 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:37:35.722239 1195787 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:37:35.722243 1195787 command_runner.go:130] > # metrics_collectors = [
	I1218 00:37:35.722247 1195787 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:37:35.722252 1195787 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:37:35.722256 1195787 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:37:35.722260 1195787 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:37:35.722266 1195787 command_runner.go:130] > # 	"operations_total",
	I1218 00:37:35.722270 1195787 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:37:35.722275 1195787 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:37:35.722279 1195787 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:37:35.722283 1195787 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:37:35.722287 1195787 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:37:35.722295 1195787 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:37:35.722299 1195787 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:37:35.722312 1195787 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:37:35.722316 1195787 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:37:35.722321 1195787 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:37:35.722325 1195787 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:37:35.722329 1195787 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:37:35.722332 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722338 1195787 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:37:35.722342 1195787 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:37:35.722347 1195787 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:37:35.722351 1195787 command_runner.go:130] > # metrics_port = 9090
	I1218 00:37:35.722358 1195787 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:37:35.722362 1195787 command_runner.go:130] > # metrics_socket = ""
	I1218 00:37:35.722377 1195787 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:37:35.722386 1195787 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:37:35.722398 1195787 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:37:35.722403 1195787 command_runner.go:130] > # certificate on any modification event.
	I1218 00:37:35.722406 1195787 command_runner.go:130] > # metrics_cert = ""
	I1218 00:37:35.722411 1195787 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:37:35.722421 1195787 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:37:35.722424 1195787 command_runner.go:130] > # metrics_key = ""
	I1218 00:37:35.722433 1195787 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:37:35.722437 1195787 command_runner.go:130] > [crio.tracing]
	I1218 00:37:35.722445 1195787 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:37:35.722451 1195787 command_runner.go:130] > # enable_tracing = false
	I1218 00:37:35.722464 1195787 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:37:35.722472 1195787 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:37:35.722479 1195787 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:37:35.722485 1195787 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:37:35.722490 1195787 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:37:35.722493 1195787 command_runner.go:130] > [crio.nri]
	I1218 00:37:35.722498 1195787 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:37:35.722507 1195787 command_runner.go:130] > # enable_nri = true
	I1218 00:37:35.722519 1195787 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:37:35.722524 1195787 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:37:35.722528 1195787 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:37:35.722539 1195787 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:37:35.722544 1195787 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:37:35.722549 1195787 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:37:35.722557 1195787 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:37:35.722613 1195787 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:37:35.722623 1195787 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:37:35.722628 1195787 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:37:35.722634 1195787 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:37:35.722640 1195787 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:37:35.722645 1195787 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:37:35.722651 1195787 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:37:35.722658 1195787 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:37:35.722663 1195787 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:37:35.722666 1195787 command_runner.go:130] > # - OCI hook injection
	I1218 00:37:35.722671 1195787 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:37:35.722677 1195787 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:37:35.722683 1195787 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:37:35.722689 1195787 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:37:35.722696 1195787 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:37:35.722702 1195787 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:37:35.722709 1195787 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:37:35.722712 1195787 command_runner.go:130] > #
	I1218 00:37:35.722717 1195787 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:37:35.722724 1195787 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:37:35.722729 1195787 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:37:35.722734 1195787 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:37:35.722739 1195787 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:37:35.722744 1195787 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:37:35.722749 1195787 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:37:35.722759 1195787 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:37:35.722765 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722771 1195787 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:37:35.722777 1195787 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:37:35.722788 1195787 command_runner.go:130] > [crio.stats]
	I1218 00:37:35.722797 1195787 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:37:35.722805 1195787 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:37:35.722809 1195787 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:37:35.722814 1195787 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:37:35.722821 1195787 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:37:35.722825 1195787 command_runner.go:130] > # collection_period = 0
	I1218 00:37:35.722870 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686277403Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:37:35.722885 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686455769Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:37:35.722906 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686635242Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:37:35.722915 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686725939Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:37:35.722930 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686860827Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.722940 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.687143526Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:37:35.722954 1195787 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:37:35.723070 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:35.723084 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:35.723105 1195787 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:37:35.723135 1195787 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath
:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:37:35.723264 1195787 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
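	[editor's note] The generated kubeadm config above is one file holding four YAML documents (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) separated by `---`, which is what minikube later scp's to /var/tmp/minikube/kubeadm.yaml.new. A minimal sketch of that multi-document shape, using a throwaway file in /tmp rather than the real paths:

```shell
# Illustrative only: reproduce the four-document layout of the kubeadm
# config logged above and count the documents by their 'kind:' lines.
cat > /tmp/kubeadm-demo.yaml <<'EOF'
kind: InitConfiguration
---
kind: ClusterConfiguration
---
kind: KubeletConfiguration
---
kind: KubeProxyConfiguration
EOF
grep -c '^kind:' /tmp/kubeadm-demo.yaml   # prints 4
```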
	I1218 00:37:35.723342 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:37:35.730799 1195787 command_runner.go:130] > kubeadm
	I1218 00:37:35.730815 1195787 command_runner.go:130] > kubectl
	I1218 00:37:35.730820 1195787 command_runner.go:130] > kubelet
	I1218 00:37:35.730852 1195787 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:37:35.730903 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:37:35.737892 1195787 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:37:35.749699 1195787 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:37:35.761635 1195787 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:37:35.773650 1195787 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:37:35.777155 1195787 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:37:35.777265 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.913809 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:36.641224 1195787 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:37:36.641246 1195787 certs.go:195] generating shared ca certs ...
	I1218 00:37:36.641263 1195787 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:36.641410 1195787 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:37:36.641464 1195787 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:37:36.641475 1195787 certs.go:257] generating profile certs ...
	I1218 00:37:36.641577 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:37:36.641667 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:37:36.641711 1195787 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:37:36.641724 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:37:36.641737 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:37:36.641753 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:37:36.641763 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:37:36.641780 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:37:36.641792 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:37:36.641807 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:37:36.641818 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:37:36.641873 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:37:36.641907 1195787 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:37:36.641920 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:37:36.641952 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:37:36.641982 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:37:36.642014 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:37:36.642068 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:36.642106 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:37:36.642122 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.642133 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.642704 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:37:36.662928 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:37:36.685489 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:37:36.708038 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:37:36.726679 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:37:36.744109 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:37:36.760724 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:37:36.777802 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:37:36.794736 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:37:36.811089 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:37:36.827838 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:37:36.844718 1195787 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:37:36.856626 1195787 ssh_runner.go:195] Run: openssl version
	I1218 00:37:36.862122 1195787 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:37:36.862595 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.869813 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:37:36.876968 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880287 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880319 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880364 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.920445 1195787 command_runner.go:130] > b5213941
	I1218 00:37:36.920887 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:37:36.928015 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.934857 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:37:36.941992 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945456 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945522 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945583 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.985712 1195787 command_runner.go:130] > 51391683
	I1218 00:37:36.986191 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:37:36.993294 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.001803 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:37:37.011590 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.016819 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017267 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017348 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.061113 1195787 command_runner.go:130] > 3ec20f2e
	I1218 00:37:37.061606 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
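	[editor's note] The three openssl/ln sequences above follow OpenSSL's hashed-directory convention: `openssl x509 -hash -noout` prints the subject-name hash (e.g. b5213941), and the trust store lookup expects a symlink named `<hash>.0` in /etc/ssl/certs. A sketch of that step, using a throwaway self-signed cert in /tmp in place of minikubeCA.pem (paths and CN are illustrative; assumes the openssl CLI is available):

```shell
# Generate a stand-in CA cert, then install it the way the log does above:
# compute the subject hash and create the <hash>.0 symlink OpenSSL looks for.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 -subj "/CN=demoCA" \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem 2>/dev/null
HASH=$(openssl x509 -hash -noout -in /tmp/demo-ca.pem)
ln -fs /tmp/demo-ca.pem "/tmp/${HASH}.0"   # real target dir: /etc/ssl/certs
```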
	I1218 00:37:37.068668 1195787 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072025 1195787 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072050 1195787 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:37:37.072057 1195787 command_runner.go:130] > Device: 259,1	Inode: 1326178     Links: 1
	I1218 00:37:37.072063 1195787 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:37.072070 1195787 command_runner.go:130] > Access: 2025-12-18 00:33:28.828061434 +0000
	I1218 00:37:37.072075 1195787 command_runner.go:130] > Modify: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072080 1195787 command_runner.go:130] > Change: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072086 1195787 command_runner.go:130] >  Birth: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072155 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:37:37.111978 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.112489 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:37:37.152999 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.153074 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:37:37.194884 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.195292 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:37:37.235218 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.235658 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:37:37.275710 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.276177 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:37:37.316082 1195787 command_runner.go:130] > Certificate will not expire
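	[editor's note] Each "Certificate will not expire" line above comes from `openssl x509 -checkend 86400`, which asks whether the cert expires within the next 86400 seconds (24h) and prints exactly that message when it does not. A self-contained sketch with a throwaway cert (the real targets are the files under /var/lib/minikube/certs):

```shell
# Create a cert valid for 30 days, then run the same expiry check the log
# performs; -checkend exits 0 and prints the message when the cert outlives
# the given window.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 -subj "/CN=demo" \
  -keyout /tmp/ck.key -out /tmp/ck.pem 2>/dev/null
openssl x509 -noout -in /tmp/ck.pem -checkend 86400   # prints "Certificate will not expire"
```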
	I1218 00:37:37.316486 1195787 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:37.316593 1195787 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:37:37.316685 1195787 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:37:37.341722 1195787 cri.go:89] found id: ""
	I1218 00:37:37.341828 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:37:37.348335 1195787 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:37:37.348357 1195787 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:37:37.348372 1195787 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:37:37.349183 1195787 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:37:37.349197 1195787 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:37:37.349253 1195787 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:37:37.356307 1195787 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:37:37.356734 1195787 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.356836 1195787 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "functional-288604" cluster setting kubeconfig missing "functional-288604" context setting]
	I1218 00:37:37.357097 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.357514 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.357675 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.358178 1195787 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:37:37.358185 1195787 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:37:37.358343 1195787 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:37:37.358365 1195787 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:37:37.358389 1195787 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:37:37.358400 1195787 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:37:37.358747 1195787 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:37:37.366250 1195787 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:37:37.366287 1195787 kubeadm.go:602] duration metric: took 17.084351ms to restartPrimaryControlPlane
	I1218 00:37:37.366297 1195787 kubeadm.go:403] duration metric: took 49.819997ms to StartCluster
	I1218 00:37:37.366310 1195787 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.366369 1195787 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.366947 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.367145 1195787 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:37:37.367532 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:37.367580 1195787 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:37:37.367705 1195787 addons.go:70] Setting storage-provisioner=true in profile "functional-288604"
	I1218 00:37:37.367724 1195787 addons.go:239] Setting addon storage-provisioner=true in "functional-288604"
	I1218 00:37:37.367744 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.368436 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.368583 1195787 addons.go:70] Setting default-storageclass=true in profile "functional-288604"
	I1218 00:37:37.368601 1195787 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-288604"
	I1218 00:37:37.368944 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.373199 1195787 out.go:179] * Verifying Kubernetes components...
	I1218 00:37:37.376080 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:37.397822 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.397983 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.398246 1195787 addons.go:239] Setting addon default-storageclass=true in "functional-288604"
	I1218 00:37:37.398278 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.398894 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.407451 1195787 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:37:37.410300 1195787 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.410322 1195787 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:37:37.410384 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.434096 1195787 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:37.434117 1195787 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:37:37.434174 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.457842 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.477819 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.583963 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:37.618382 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.637024 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.392142 1195787 node_ready.go:35] waiting up to 6m0s for node "functional-288604" to be "Ready" ...
	I1218 00:37:38.392289 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.392356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.392602 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392638 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392662 1195787 retry.go:31] will retry after 293.380468ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392710 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392727 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392733 1195787 retry.go:31] will retry after 283.333163ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:38.676355 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.686660 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:38.750557 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753745 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753775 1195787 retry.go:31] will retry after 508.906429ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753840 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753899 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753916 1195787 retry.go:31] will retry after 283.918132ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.893115 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.893199 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.893535 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.038817 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.092066 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.095485 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.095518 1195787 retry.go:31] will retry after 317.14343ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.262906 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.318327 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.322166 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.322196 1195787 retry.go:31] will retry after 611.398612ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.392378 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.413200 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.474250 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.474288 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.474326 1195787 retry.go:31] will retry after 551.991324ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.892368 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.892757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.933930 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.991113 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.991153 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.991172 1195787 retry.go:31] will retry after 590.272449ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.027415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:40.085906 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.089482 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.089515 1195787 retry.go:31] will retry after 1.798316027s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.392931 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.393007 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.393310 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:40.393376 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:40.582668 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:40.643859 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.643900 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.643941 1195787 retry.go:31] will retry after 1.196819353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.892768 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.392495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.841577 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:41.888099 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:41.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.901267 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.901306 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.901323 1195787 retry.go:31] will retry after 1.106575841s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948402 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.948447 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948500 1195787 retry.go:31] will retry after 1.314106681s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:42.393054 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.393195 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.393477 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:42.393524 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:42.893249 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.893594 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.008894 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:43.066157 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.066194 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.066212 1195787 retry.go:31] will retry after 2.952953914s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.263490 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:43.325047 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.325147 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.325201 1195787 retry.go:31] will retry after 2.165088511s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.892529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.892853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.392698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.892859 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:44.892927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:45.392615 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.393055 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:45.491313 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:45.548834 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:45.552259 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.552290 1195787 retry.go:31] will retry after 4.009218302s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.020180 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:46.081331 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:46.081373 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.081392 1195787 retry.go:31] will retry after 2.724964309s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.392810 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.392886 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.393216 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.893121 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.893451 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:46.893527 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:47.392312 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.392379 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.392690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:47.892435 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.892508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.806570 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:48.859925 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:48.863450 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.863523 1195787 retry.go:31] will retry after 5.125713123s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.892640 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.892710 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.892972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:49.392419 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.392858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:49.392930 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:49.562244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:49.616549 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:49.619912 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.619976 1195787 retry.go:31] will retry after 7.525324152s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.893380 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.893476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.893792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.392521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.892413 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.392501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:51.892896 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:52.392394 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:52.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.892886 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.892492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.990244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:54.052144 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:54.052189 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.052212 1195787 retry.go:31] will retry after 10.028215297s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:54.392879 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:54.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.892892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.892535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:56.892936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:57.146448 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:57.223873 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:57.223911 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.223929 1195787 retry.go:31] will retry after 7.68443688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.392441 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.392757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:57.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.892896 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.892496 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.892902 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:58.892976 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:59.392373 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.392729 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:59.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.392605 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.392823 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.393374 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.893180 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.893259 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.893590 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:00.893634 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:01.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.392775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:01.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.892560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.392629 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.393091 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.893045 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.893120 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.893431 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:03.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.393360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.393682 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:03.393752 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:03.892452 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.892588 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.081415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:04.149098 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.149154 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.149173 1195787 retry.go:31] will retry after 12.181474759s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.392826 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.892706 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.908952 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:04.983582 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.983679 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.983707 1195787 retry.go:31] will retry after 20.674508131s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:05.393152 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.393222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.393548 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:05.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:05.892840 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:06.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:06.892466 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.892581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.392796 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.392870 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.393185 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.893008 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.893099 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.893411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:07.893460 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:08.393200 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.393269 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.393580 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:08.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.392350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:10.392470 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.392542 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:10.392885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:10.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.392567 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.392748 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.392724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.892526 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.892600 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.892927 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:12.892994 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:13.392672 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.393083 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:13.892350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.892423 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:15.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.392797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:15.392868 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:15.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.331590 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:16.385966 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:16.389831 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.389871 1195787 retry.go:31] will retry after 10.81475415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.393176 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.393493 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.893314 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.893409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.893794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:17.392528 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.392670 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:17.393070 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:17.892892 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.892977 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.893319 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.393167 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.393296 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:19.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.392531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.393011 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:19.393093 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:19.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.892493 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.392457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.392540 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.892404 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.892505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.892840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.392336 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.392752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:21.892899 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:22.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.392824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:22.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.892650 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.892812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:24.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.392459 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.392739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:24.392822 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:24.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.392505 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.392578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.392919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.658449 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:25.718689 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:25.718787 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.718811 1195787 retry.go:31] will retry after 20.411460434s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.893032 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.893152 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.893496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:26.393268 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.393345 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.393658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:26.393735 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:26.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.205308 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:27.264744 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:27.264795 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.264821 1195787 retry.go:31] will retry after 26.872581906s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.393247 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.393343 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.393691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.392400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.392499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.892880 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:28.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:29.392435 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.392530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:29.892518 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.892615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.892959 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.392483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.892684 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:30.893088 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:31.392443 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.392538 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:31.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.392398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.892495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:33.392363 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.392786 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:33.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:33.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.892942 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:35.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:35.392919 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:35.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.892552 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.892909 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:37.392676 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.392747 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.393087 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:37.393163 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:37.892815 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.892918 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.893191 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.392972 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.393069 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.393395 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.893206 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.893283 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.893614 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.892914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:39.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:40.392478 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:40.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.392429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.892557 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.892907 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:42.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.392468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:42.392895 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.892798 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.893169 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.392965 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.393041 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.393425 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.893065 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.893192 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.893542 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:44.393319 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.393457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.393797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:44.393864 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:44.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.892887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.392407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.392691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.892831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.131350 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:46.207148 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:46.207192 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.207211 1195787 retry.go:31] will retry after 46.493082425s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.392632 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.393042 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.892356 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.892439 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:46.892784 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:47.392561 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.392655 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.393022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:47.892957 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.893052 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.393028 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.393151 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.393502 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.893231 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.893339 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:48.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:49.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.892410 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.892719 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.392567 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.392922 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.892639 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.892718 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.893017 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:51.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:51.392927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:51.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.392587 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.392689 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.393078 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.892911 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.893278 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:53.393104 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.393175 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.393506 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:53.393578 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:53.893174 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.893253 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.893635 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.138097 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:54.199604 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:54.199639 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.199657 1195787 retry.go:31] will retry after 32.999586692s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.392915 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.392997 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.393320 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.893151 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.893222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.893558 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:55.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.393372 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.393696 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:55.393771 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:55.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.392482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.392536 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:57.892898 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:58.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.392510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.392842 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.892427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.892690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.392771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:00.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.392852 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:00.392906 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:00.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.392554 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.392642 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.392932 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:02.392460 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:02.392937 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:02.893082 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.893161 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.893517 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.393213 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.393335 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.393710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.893298 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.893393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.893695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.392345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.392774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.892322 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.892394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:04.892799 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:05.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:05.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.892687 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.893007 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.392320 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.392389 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.392734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:06.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:07.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.392661 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.393015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:07.892841 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.892966 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.893308 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.393076 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.393143 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.393465 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.893245 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.893642 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:08.893706 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:09.392340 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:09.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.392603 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.393041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.892396 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.892674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:11.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.392788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:11.392854 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:11.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.892864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.392694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:13.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.392558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:13.392904 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:13.892358 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.392462 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.892493 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.892568 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.892916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.392728 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:15.892902 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:16.392592 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.393051 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:16.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.892766 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.392504 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.392612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.392914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.892926 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.893001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:17.893380 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:18.393186 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.393287 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.393589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:18.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.892436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.892724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.892501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:20.392577 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.392677 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:20.393034 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:20.892700 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.892780 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.893096 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.392884 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.392972 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.393246 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.893036 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.893115 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.893439 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:22.393105 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.393180 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.393531 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:22.393602 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:22.893283 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.893357 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.893654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.392360 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.392759 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.892410 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.892763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.392324 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.392671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.892367 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:24.892851 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:25.392405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.392832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:25.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.892598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:26.893059 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:27.199423 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:39:27.258871 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262674 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262814 1195787 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:27.393144 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.393338 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.393739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:27.892445 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.892515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.392686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.393001 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.892388 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:29.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:29.392759 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:29.892449 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.892575 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.892898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.892522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:31.392593 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.392698 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.393057 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:31.393175 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.892746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.393076 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.700811 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:39:32.759422 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759519 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759610 1195787 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:32.762569 1195787 out.go:179] * Enabled addons: 
	I1218 00:39:32.766452 1195787 addons.go:530] duration metric: took 1m55.398865574s for enable addons: enabled=[]
	I1218 00:39:32.892720 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.892834 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.893134 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:33.392885 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.392951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.393266 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:33.393360 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:33.893120 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.893193 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.893559 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.393379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.393480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.393839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.892868 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.392614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.892537 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:35.892942 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:36.392383 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.392802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:36.892719 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.892802 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.893187 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.393130 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.393210 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.393536 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.892374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:38.392658 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.392729 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:38.393158 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:38.892929 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.893353 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.393129 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.393197 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.393452 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.893242 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.893317 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.893673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.392517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.892369 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.892525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:40.892938 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:41.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.392798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:41.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.892875 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.392446 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.892629 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.892701 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.893027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:42.893091 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:43.392773 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.392853 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.393188 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:43.892986 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.893071 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.893334 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.393187 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.393481 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.893282 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.893350 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.893683 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:44.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:45.392327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.392700 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:45.892412 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.392830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.892694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:47.392541 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:47.392943 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:47.892913 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.892990 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.392935 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.393001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.393289 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.893086 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.893157 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.893470 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:49.393330 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.393420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.393740 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:49.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:49.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.892391 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.892665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.392473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.892447 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.392762 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.892406 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:51.892823 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:52.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:52.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.392422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.392736 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.892489 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.892612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:53.893004 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:54.392376 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:54.892366 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.892778 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:56.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:56.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:56.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.392473 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.892694 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.892772 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.892739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:58.892789 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:59.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.392532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:59.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.892633 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.392899 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.892419 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:00.892885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:01.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:01.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.892996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.392436 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.392515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.892688 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.892766 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:02.893136 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:03.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:03.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.392791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.892671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:05.392381 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:05.392855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:05.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:07.392557 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.392630 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.392948 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:07.393044 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:07.892947 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.893292 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.393116 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.393189 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.393503 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.893257 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.893330 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.893657 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:09.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.393365 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.393623 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:09.393667 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:09.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.892429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.392530 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.892422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.892749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.892544 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.892620 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:11.893010 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:12.392339 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.392715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:12.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.892814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:14.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.392813 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:14.392867 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:14.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.392448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.892562 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.892645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.892993 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:16.392714 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.392789 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.393108 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:16.393181 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:16.892981 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.893057 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.893318 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.393280 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.393362 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.393715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.392754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.892443 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.892891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:18.892951 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:19.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.392715 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.393048 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:19.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.392464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.392845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:21.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:21.392721 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:21.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.392527 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:23.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:23.392882 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:23.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.392684 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:25.392518 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.392622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.393004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:25.393058 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:25.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.892660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.392355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.892589 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.392982 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.892845 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.892928 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.893247 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:27.893296 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:28.392780 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.392854 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.393198 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:28.892980 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.893051 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.893305 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.393025 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.393097 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.393406 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.893167 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.893246 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.893569 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:29.893625 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:30.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.392419 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.392749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:30.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.892844 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.392817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.892407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:32.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.392882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:32.392936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:32.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.892506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.392470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.892473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:34.892850 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:35.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.392811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:35.892523 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.892622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.892978 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.392672 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.892754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.392604 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:37.393019 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:37.892964 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.893061 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.893356 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.393199 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.393280 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.393633 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.892784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.392710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.892802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:39.892855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:40.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:40.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.892649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.892534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:42.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.392664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:42.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.892597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.392651 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.392720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.393047 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.892468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.892851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:44.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.392461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:44.392853 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:44.892484 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.892559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.892919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.392680 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.892767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.892679 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:46.892729 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:47.392534 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.392616 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:47.893009 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.893086 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.893414 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.393173 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.393243 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.393496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.893310 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.893387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.893701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:48.893755 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:49.392365 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.392767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:49.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.892412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.392433 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:51.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.392457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:51.392745 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:51.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.392597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.392916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.892836 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.892908 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.893174 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:53.393008 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.393079 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.393392 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:53.393445 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:53.893170 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.893250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.893577 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.393256 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.393323 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.393624 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.892833 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.892909 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.893230 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.392807 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.392879 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.393195 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.892903 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.892983 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.893248 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:55.893295 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:56.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.393164 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.393494 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:56.893233 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.893303 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.893625 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.392308 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.892561 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.892640 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.893004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:58.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.392770 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:58.392830 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:58.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.392417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.392375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.392732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:00.892926 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:01.392600 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.392680 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.393027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:01.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.392459 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.392534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.392891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.892379 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:03.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.392673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:03.392717 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:03.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.892520 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.892803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:05.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:05.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:05.892501 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.892578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.892871 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.392458 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.392846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.892564 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.892651 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:07.392849 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.392927 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.393249 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:07.393300 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:07.893061 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.893141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.893407 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.393571 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.893213 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.893285 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.893586 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:09.393338 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.393409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.393737 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:09.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:09.892310 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.892384 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.892460 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.892531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.392821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.892452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:11.892845 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:12.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.392447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:12.892659 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.893041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.392855 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.892343 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:14.392354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.392431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:14.392815 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:14.892477 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.892893 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:16.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.392508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:16.392886 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:16.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.392663 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.393122 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.893034 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.893112 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.893434 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:18.393225 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.393298 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.393566 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:18.393619 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:18.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.892776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.392463 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.392545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:20.892875 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:21.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.392697 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:21.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.392426 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.892691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:23.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:23.392897 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:23.892549 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.892621 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.892930 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.392371 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:25.392487 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.392571 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.392908 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:25.392969 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:25.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.892701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.392465 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.392539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.892616 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.892694 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.892997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:27.893042 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:28.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:28.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.392438 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.892380 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:30.392322 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:30.392775 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:30.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.392484 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.392563 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.392894 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:32.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:32.392859 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:32.892645 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.892720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.893273 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.393053 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.393128 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.393383 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.893197 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.893271 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.893602 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.392329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.892409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:34.892764 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:35.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:35.892454 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.392716 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:36.892870 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:37.392455 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.392936 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:37.893011 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.893129 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.893427 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.393291 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.393628 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.893349 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.893718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:38.893795 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:39.393265 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.393337 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.393588 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:39.893331 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.893408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.893734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.892710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:41.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:41.392842 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:41.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.892453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.892740 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.893071 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:43.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.392714 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.393030 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:43.393087 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:43.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.892423 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.892741 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:45.892958 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:46.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:46.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.892695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.392510 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.392586 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.392951 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:48.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.392712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:48.392768 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:48.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.392787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.892646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:50.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:50.392909 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:50.892579 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.892652 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.892985 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.392709 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.392792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.892354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:52.892725 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:53.392414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.392831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:53.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:54.892908 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:55.392584 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.392665 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.392996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:55.892326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.892800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.392401 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.392474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.892944 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:56.892999 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:57.392855 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.392925 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:57.893155 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.893229 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.893537 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.393357 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.393431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.393713 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:59.392495 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.392587 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.392917 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:59.392973 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:59.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.893052 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:01.392764 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.392860 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:01.393227 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:01.892948 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.893270 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.393163 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.393444 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.892430 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.892742 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.892755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:03.892801 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:04.392448 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.392523 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:04.892336 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.892677 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:05.892848 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:06.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:06.892407 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.892929 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.392690 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.392765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.393085 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.893075 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.893147 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:07.893442 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:08.393253 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.393325 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.393644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:08.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.392738 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:10.392423 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:10.392894 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:10.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.392445 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.392851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.892553 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.892632 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.892973 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.392666 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:12.892876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:13.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.392815 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:13.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.392377 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:14.892953 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:15.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.392701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:15.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.392442 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.392833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.892667 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:17.392522 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.392590 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:17.392921 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:17.892652 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.892728 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.893037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.892817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.392472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.892338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:19.892782 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:20.392466 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:20.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:21.892849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:22.392532 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.392615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.392957 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:22.892669 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.892988 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.392497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.892580 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.892912 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:23.892972 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:24.392362 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.392433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.392780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:24.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.892471 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.392469 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.392543 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:26.392450 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.392522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.392876 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:26.392935 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:26.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.892686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.392919 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.392992 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.393253 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.893175 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.893249 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.893584 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.392338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.892451 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:28.892862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:29.392512 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.392870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:29.892578 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.892656 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.392314 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.893263 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.893356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.893732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:30.893797 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:31.392476 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.392560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:31.892342 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.892698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.392782 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:33.392366 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.392818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:33.392876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:33.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.892447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.392507 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.392934 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.892413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:35.392488 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:35.392932 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:35.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.892504 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.392359 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.392660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.393104 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:37.393155 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:37.892886 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.892951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.893194 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.392988 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.393067 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.393390 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.893201 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.893273 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.893589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:39.393344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.393415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.393662 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:39.393702 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:39.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.392451 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.392529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.392862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.892870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:42.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:42.892516 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.892611 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.892938 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.392638 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.392711 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.393037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.892699 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.892765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:43.893055 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:44.392565 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.392645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.392956 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:44.892656 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.892731 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.893058 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.392581 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.392846 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.393290 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.893123 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.893447 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:45.893502 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:46.393297 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.393390 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.393780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:46.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.892799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.392560 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.392631 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.392960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.892485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:48.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.392663 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:48.392703 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:48.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.392544 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.392943 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:50.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:50.392849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:50.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.392654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.892315 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.892388 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.892718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:52.392427 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:52.392889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:52.892324 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.892546 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.892874 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.392387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.892485 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.892882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:54.892940 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:55.392636 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.393066 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:55.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:57.392739 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.392818 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.393074 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:57.393125 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:57.892970 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.893044 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.893385 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.393560 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.893294 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.893360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.893621 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.892745 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:59.892790 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:00.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.392421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:00.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.892467 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.392584 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:02.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.392819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:02.392871 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:02.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.392644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.392425 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.892634 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:04.892673 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:05.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:05.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.892809 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.392658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:06.892843 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:07.392572 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.392646 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:07.892874 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.892944 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.893199 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.393002 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.393080 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.393411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.893209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.893286 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.893674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:08.893724 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:09.392325 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.392655 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:09.892370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.392431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:11.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:11.392883 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:11.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.892939 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.392370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.892624 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:13.392523 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.392903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:13.392963 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:13.892519 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.892431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.892510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.392321 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.892434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.892732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:15.892778 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:16.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:16.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.893341 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.893646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.392499 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.392579 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.892490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:17.892807 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:18.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:18.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.393074 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.393141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.393442 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.893090 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.893170 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.893422 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:19.893473 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:20.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.393284 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.393611 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:20.893284 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.893361 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:22.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:22.392844 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:22.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.892425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.892708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.392485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.392394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.392659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.892838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:24.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:25.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:25.892357 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.892457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.892775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.392472 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.892582 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.892678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.893022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:26.893073 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:27.392724 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.392791 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.393032 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:27.893016 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.893101 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.893433 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.393326 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.892772 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:29.392456 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.392869 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:29.392915 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:29.892602 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.892673 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.892440 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.392566 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.892417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:31.892716 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:32.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.392776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:32.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.892858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:33.392332 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.397972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1218 00:43:33.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:33.892918 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:34.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:34.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.892442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.892725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.392438 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.392511 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.892534 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.892662 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:35.893039 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:36.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:36.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.392739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.892912 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.892985 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.893251 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:37.893290 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:38.393068 1195787 node_ready.go:38] duration metric: took 6m0.000870722s for node "functional-288604" to be "Ready" ...
	I1218 00:43:38.396243 1195787 out.go:203] 
	W1218 00:43:38.399208 1195787 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:43:38.399223 1195787 out.go:285] * 
	W1218 00:43:38.401353 1195787 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:43:38.404386 1195787 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409510412Z" level=info msg="Using the internal default seccomp profile"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409547589Z" level=info msg="AppArmor is disabled by the system or at CRI-O build-time"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409556253Z" level=info msg="No blockio config file specified, blockio not configured"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409562595Z" level=info msg="RDT not available in the host system"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.409584781Z" level=info msg="Using conmon executable: /usr/libexec/crio/conmon"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410490591Z" level=info msg="Conmon does support the --sync option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410514878Z" level=info msg="Conmon does support the --log-global-size-max option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.410537728Z" level=info msg="Using conmon executable: /usr/libexec/crio/conmon"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411154759Z" level=info msg="Conmon does support the --sync option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411175239Z" level=info msg="Conmon does support the --log-global-size-max option"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.411325043Z" level=info msg="Updated default CNI network name to "
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.4119085Z" level=info msg="Current CRI-O configuration:\n[crio]\n  root = \"/var/lib/containers/storage\"\n  runroot = \"/run/containers/storage\"\n  imagestore = \"\"\n  storage_driver = \"overlay\"\n  log_dir = \"/var/log/crio/pods\"\n  version_file = \"/var/run/crio/version\"\n  version_file_persist = \"\"\n  clean_shutdown_file = \"/var/lib/crio/clean.shutdown\"\n  internal_wipe = true\n  internal_repair = true\n  [crio.api]\n    grpc_max_send_msg_size = 83886080\n    grpc_max_recv_msg_size = 83886080\n    listen = \"/var/run/crio/crio.sock\"\n    stream_address = \"127.0.0.1\"\n    stream_port = \"0\"\n    stream_enable_tls = false\n    stream_tls_cert = \"\"\n    stream_tls_key = \"\"\n    stream_tls_ca = \"\"\n    stream_idle_timeout = \"\"\n  [crio.runtime]\n    no_pivot = false\n    selinux = false\n    log_to_journald = false\n    drop_infra_ctr = true\n    read_only = false\n    hooks_dir = [\"/usr/share/containers/oci/
hooks.d\"]\n    default_capabilities = [\"CHOWN\", \"DAC_OVERRIDE\", \"FSETID\", \"FOWNER\", \"SETGID\", \"SETUID\", \"SETPCAP\", \"NET_BIND_SERVICE\", \"KILL\"]\n    add_inheritable_capabilities = false\n    default_sysctls = [\"net.ipv4.ip_unprivileged_port_start=0\"]\n    allowed_devices = [\"/dev/fuse\", \"/dev/net/tun\"]\n    cdi_spec_dirs = [\"/etc/cdi\", \"/var/run/cdi\"]\n    device_ownership_from_security_context = false\n    default_runtime = \"crun\"\n    decryption_keys_path = \"/etc/crio/keys/\"\n    conmon = \"\"\n    conmon_cgroup = \"pod\"\n    seccomp_profile = \"\"\n    privileged_seccomp_profile = \"\"\n    apparmor_profile = \"crio-default\"\n    blockio_config_file = \"\"\n    blockio_reload = false\n    irqbalance_config_file = \"/etc/sysconfig/irqbalance\"\n    rdt_config_file = \"\"\n    cgroup_manager = \"cgroupfs\"\n    default_mounts_file = \"\"\n    container_exits_dir = \"/var/run/crio/exits\"\n    container_attach_socket_dir = \"/var/run/crio\"\n    bind_mount_prefix = \"\"\n
uid_mappings = \"\"\n    minimum_mappable_uid = -1\n    gid_mappings = \"\"\n    minimum_mappable_gid = -1\n    log_level = \"info\"\n    log_filter = \"\"\n    namespaces_dir = \"/var/run\"\n    pinns_path = \"/usr/bin/pinns\"\n    enable_criu_support = false\n    pids_limit = -1\n    log_size_max = -1\n    ctr_stop_timeout = 30\n    separate_pull_cgroup = \"\"\n    infra_ctr_cpuset = \"\"\n    shared_cpuset = \"\"\n    enable_pod_events = false\n    irqbalance_config_restore_file = \"/etc/sysconfig/orig_irq_banned_cpus\"\n    hostnetwork_disable_selinux = true\n    disable_hostport_mapping = false\n    timezone = \"\"\n    [crio.runtime.runtimes]\n      [crio.runtime.runtimes.crun]\n        runtime_config_path = \"\"\n        runtime_path = \"/usr/libexec/crio/crun\"\n        runtime_type = \"\"\n        runtime_root = \"/run/crun\"\n        allowed_annotations = [\"io.containers.trace-syscall\"]\n        monitor_path = \"/usr/libexec/crio/conmon\"\n        monitor_cgroup = \"pod\"\n        container_min_me
mory = \"12MiB\"\n        no_sync_log = false\n      [crio.runtime.runtimes.runc]\n        runtime_config_path = \"\"\n        runtime_path = \"/usr/libexec/crio/runc\"\n        runtime_type = \"\"\n        runtime_root = \"/run/runc\"\n        monitor_path = \"/usr/libexec/crio/conmon\"\n        monitor_cgroup = \"pod\"\n        container_min_memory = \"12MiB\"\n        no_sync_log = false\n  [crio.image]\n    default_transport = \"docker://\"\n    global_auth_file = \"\"\n    namespaced_auth_dir = \"/etc/crio/auth\"\n    pause_image = \"registry.k8s.io/pause:3.10.1\"\n    pause_image_auth_file = \"\"\n    pause_command = \"/pause\"\n    signature_policy = \"/etc/crio/policy.json\"\n    signature_policy_dir = \"/etc/crio/policies\"\n    image_volumes = \"mkdir\"\n    big_files_temporary_dir = \"\"\n    auto_reload_registries = false\n    pull_progress_timeout = \"0s\"\n    oci_artifact_mount_support = true\n    short_name_mode = \"enforcing\"\n  [crio.network]\n    cni_default_network = \"\"\n    network_dir
= \"/etc/cni/net.d/\"\n    plugin_dirs = [\"/opt/cni/bin/\"]\n  [crio.metrics]\n    enable_metrics = false\n    metrics_collectors = [\"image_pulls_layer_size\", \"containers_events_dropped_total\", \"containers_oom_total\", \"processes_defunct\", \"operations_total\", \"operations_latency_seconds\", \"operations_latency_seconds_total\", \"operations_errors_total\", \"image_pulls_bytes_total\", \"image_pulls_skipped_bytes_total\", \"image_pulls_failure_total\", \"image_pulls_success_total\", \"image_layer_reuse_total\", \"containers_oom_count_total\", \"containers_seccomp_notifier_count_total\", \"resources_stalled_at_stage\", \"containers_stopped_monitor_count\"]\n    metrics_host = \"127.0.0.1\"\n    metrics_port = 9090\n    metrics_socket = \"\"\n    metrics_cert = \"\"\n    metrics_key = \"\"\n  [crio.tracing]\n    enable_tracing = false\n    tracing_endpoint = \"127.0.0.1:4317\"\n    tracing_sampling_rate_per_million = 0\n  [crio.stats]\n    stats_collection_period = 0\n    collection_period = 0\n  [cri
o.nri]\n    enable_nri = true\n    nri_listen = \"/var/run/nri/nri.sock\"\n    nri_plugin_dir = \"/opt/nri/plugins\"\n    nri_plugin_config_dir = \"/etc/nri/conf.d\"\n    nri_plugin_registration_timeout = \"5s\"\n    nri_plugin_request_timeout = \"2s\"\n    nri_disable_connections = false\n    [crio.nri.default_validator]\n      nri_enable_default_validator = false\n      nri_validator_reject_oci_hook_adjustment = false\n      nri_validator_reject_runtime_default_seccomp_adjustment = false\n      nri_validator_reject_unconfined_seccomp_adjustment = false\n      nri_validator_reject_custom_seccomp_adjustment = false\n      nri_validator_reject_namespace_adjustment = false\n      nri_validator_tolerate_missing_plugins_annotation = \"\"\n"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.412366742Z" level=info msg="Attempting to restore irqbalance config from /etc/sysconfig/orig_irq_banned_cpus"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.412420673Z" level=info msg="Restore irqbalance config: failed to get current CPU ban list, ignoring"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477189227Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477239974Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477311602Z" level=info msg="Create NRI interface"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477429572Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477452775Z" level=info msg="runtime interface created"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477466354Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.47747908Z" level=info msg="runtime interface starting up..."
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477484955Z" level=info msg="starting plugins..."
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477502275Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:37:35 functional-288604 crio[5385]: time="2025-12-18T00:37:35.477588312Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:37:35 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:43:42.999406    8753 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:43.000336    8753 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:43.001545    8753 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:43.002254    8753 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:43.003895    8753 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:43:43 up  7:26,  0 user,  load average: 0.34, 0.22, 0.59
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1141.
	Dec 18 00:43:40 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:40 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:40 functional-288604 kubelet[8629]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:40 functional-288604 kubelet[8629]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:40 functional-288604 kubelet[8629]: E1218 00:43:40.971091    8629 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:40 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:41 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1142.
	Dec 18 00:43:41 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:41 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:41 functional-288604 kubelet[8649]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:41 functional-288604 kubelet[8649]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:41 functional-288604 kubelet[8649]: E1218 00:43:41.704103    8649 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:41 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:41 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:42 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1143.
	Dec 18 00:43:42 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:42 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:42 functional-288604 kubelet[8669]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:42 functional-288604 kubelet[8669]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:42 functional-288604 kubelet[8669]: E1218 00:43:42.450055    8669 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:42 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:42 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (374.835096ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.64s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 kubectl -- --context functional-288604 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 kubectl -- --context functional-288604 get pods: exit status 1 (105.266685ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-288604 kubectl -- --context functional-288604 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (320.303545ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 logs -n 25: (1.003583884s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                      ARGS                                                                       │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-240845 image ls --format json --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh     │ functional-240845 ssh pgrep buildkitd                                                                                                           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image   │ functional-240845 image ls --format yaml --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                          │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format table --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format short --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete  │ -p functional-240845                                                                                                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start   │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start   │ -p functional-288604 --alsologtostderr -v=8                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.1                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.3                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:latest                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add minikube-local-cache-test:functional-288604                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache delete minikube-local-cache-test:functional-288604                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ list                                                                                                                                            │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl images                                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ cache   │ functional-288604 cache reload                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ kubectl │ functional-288604 kubectl -- --context functional-288604 get pods                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:37:32
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:37:32.486183 1195787 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:37:32.486610 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486624 1195787 out.go:374] Setting ErrFile to fd 2...
	I1218 00:37:32.486629 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486918 1195787 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:37:32.487313 1195787 out.go:368] Setting JSON to false
	I1218 00:37:32.488152 1195787 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26401,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:37:32.488255 1195787 start.go:143] virtualization:  
	I1218 00:37:32.491971 1195787 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:37:32.494842 1195787 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:37:32.494944 1195787 notify.go:221] Checking for updates...
	I1218 00:37:32.500434 1195787 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:37:32.503311 1195787 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:32.506071 1195787 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:37:32.508979 1195787 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:37:32.511873 1195787 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:37:32.515326 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:32.515476 1195787 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:37:32.549560 1195787 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:37:32.549709 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.608968 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.600331572 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.609068 1195787 docker.go:319] overlay module found
	I1218 00:37:32.612053 1195787 out.go:179] * Using the docker driver based on existing profile
	I1218 00:37:32.614859 1195787 start.go:309] selected driver: docker
	I1218 00:37:32.614879 1195787 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.614985 1195787 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:37:32.615081 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.681718 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.67244891 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.682130 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:32.682189 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:32.682255 1195787 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.687138 1195787 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:37:32.690134 1195787 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:37:32.693078 1195787 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:37:32.696069 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:32.696123 1195787 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:37:32.696143 1195787 cache.go:65] Caching tarball of preloaded images
	I1218 00:37:32.696183 1195787 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:37:32.696303 1195787 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:37:32.696317 1195787 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:37:32.696417 1195787 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:37:32.714975 1195787 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:37:32.714995 1195787 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:37:32.715013 1195787 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:37:32.715043 1195787 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:37:32.715099 1195787 start.go:364] duration metric: took 33.796µs to acquireMachinesLock for "functional-288604"
	I1218 00:37:32.715121 1195787 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:37:32.715131 1195787 fix.go:54] fixHost starting: 
	I1218 00:37:32.715395 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:32.731575 1195787 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:37:32.731606 1195787 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:37:32.734910 1195787 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:37:32.734955 1195787 machine.go:94] provisionDockerMachine start ...
	I1218 00:37:32.735034 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.751418 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.751747 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.751760 1195787 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:37:32.904326 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:32.904350 1195787 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:37:32.904413 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.933199 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.933525 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.933536 1195787 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:37:33.096692 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:33.096816 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.115124 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.115445 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.115466 1195787 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:37:33.272592 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:37:33.272617 1195787 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:37:33.272637 1195787 ubuntu.go:190] setting up certificates
	I1218 00:37:33.272647 1195787 provision.go:84] configureAuth start
	I1218 00:37:33.272712 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:33.291737 1195787 provision.go:143] copyHostCerts
	I1218 00:37:33.291803 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291863 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:37:33.291880 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291977 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:37:33.292105 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292127 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:37:33.292137 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292177 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:37:33.292274 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292300 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:37:33.292315 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292347 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:37:33.292433 1195787 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:37:33.397529 1195787 provision.go:177] copyRemoteCerts
	I1218 00:37:33.397646 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:37:33.397692 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.416603 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:33.523879 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:37:33.523950 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:37:33.540143 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:37:33.540204 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:37:33.557091 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:37:33.557194 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:37:33.573937 1195787 provision.go:87] duration metric: took 301.27685ms to configureAuth
	I1218 00:37:33.573963 1195787 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:37:33.574138 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:33.574247 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.591351 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.591663 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.591676 1195787 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:37:33.932454 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:37:33.932478 1195787 machine.go:97] duration metric: took 1.197515142s to provisionDockerMachine
	I1218 00:37:33.932490 1195787 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:37:33.932503 1195787 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:37:33.932581 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:37:33.932636 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.953296 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.060199 1195787 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:37:34.063627 1195787 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:37:34.063655 1195787 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:37:34.063660 1195787 command_runner.go:130] > VERSION_ID="12"
	I1218 00:37:34.063664 1195787 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:37:34.063680 1195787 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:37:34.063684 1195787 command_runner.go:130] > ID=debian
	I1218 00:37:34.063689 1195787 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:37:34.063694 1195787 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:37:34.063700 1195787 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:37:34.063783 1195787 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:37:34.063800 1195787 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:37:34.063810 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:37:34.063871 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:37:34.063955 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:37:34.063966 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:37:34.064048 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:37:34.064056 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:37:34.064100 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:37:34.071756 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:34.089207 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:37:34.106978 1195787 start.go:296] duration metric: took 174.472072ms for postStartSetup
	I1218 00:37:34.107054 1195787 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:37:34.107096 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.124265 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.224786 1195787 command_runner.go:130] > 12%
	I1218 00:37:34.224858 1195787 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:37:34.228879 1195787 command_runner.go:130] > 171G
	I1218 00:37:34.229324 1195787 fix.go:56] duration metric: took 1.514188493s for fixHost
	I1218 00:37:34.229353 1195787 start.go:83] releasing machines lock for "functional-288604", held for 1.514233177s
	I1218 00:37:34.229425 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:34.246154 1195787 ssh_runner.go:195] Run: cat /version.json
	I1218 00:37:34.246206 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.246451 1195787 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:37:34.246509 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.266363 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.276260 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.371623 1195787 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:37:34.371754 1195787 ssh_runner.go:195] Run: systemctl --version
	I1218 00:37:34.461010 1195787 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:37:34.461057 1195787 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:37:34.461077 1195787 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:37:34.461152 1195787 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:37:34.497659 1195787 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:37:34.501645 1195787 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:37:34.502005 1195787 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:37:34.502070 1195787 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:37:34.509755 1195787 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:37:34.509780 1195787 start.go:496] detecting cgroup driver to use...
	I1218 00:37:34.509811 1195787 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:37:34.509875 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:37:34.523916 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:37:34.536646 1195787 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:37:34.536736 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:37:34.551504 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:37:34.564054 1195787 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:37:34.675890 1195787 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:37:34.798642 1195787 docker.go:234] disabling docker service ...
	I1218 00:37:34.798703 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:37:34.813006 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:37:34.825087 1195787 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:37:34.942798 1195787 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:37:35.067868 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:37:35.088600 1195787 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:37:35.102366 1195787 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:37:35.103752 1195787 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:37:35.103819 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.113147 1195787 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:37:35.113241 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.122530 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.131393 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.140799 1195787 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:37:35.148737 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.157396 1195787 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.165643 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.174650 1195787 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:37:35.181215 1195787 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:37:35.182122 1195787 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:37:35.189136 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.306446 1195787 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:37:35.483449 1195787 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:37:35.483550 1195787 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:37:35.487145 1195787 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:37:35.487172 1195787 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:37:35.487179 1195787 command_runner.go:130] > Device: 0,72	Inode: 1642        Links: 1
	I1218 00:37:35.487186 1195787 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:35.487202 1195787 command_runner.go:130] > Access: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487220 1195787 command_runner.go:130] > Modify: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487225 1195787 command_runner.go:130] > Change: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487229 1195787 command_runner.go:130] >  Birth: -
	I1218 00:37:35.487254 1195787 start.go:564] Will wait 60s for crictl version
	I1218 00:37:35.487306 1195787 ssh_runner.go:195] Run: which crictl
	I1218 00:37:35.490344 1195787 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:37:35.490683 1195787 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:37:35.512944 1195787 command_runner.go:130] > Version:  0.1.0
	I1218 00:37:35.513232 1195787 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:37:35.513363 1195787 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:37:35.513391 1195787 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:37:35.515559 1195787 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:37:35.515677 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.541522 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.541589 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.541609 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.541630 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.541651 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.541672 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.541692 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.541720 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.541741 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.541768 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.541786 1195787 command_runner.go:130] >      static
	I1218 00:37:35.541805 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.541829 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.541856 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.541889 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.541915 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.541933 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.541952 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.541983 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.541999 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.543191 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.569029 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.569102 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.569122 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.569144 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.569164 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.569191 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.569210 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.569239 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.569267 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.569285 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.569302 1195787 command_runner.go:130] >      static
	I1218 00:37:35.569320 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.569347 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.569366 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.569384 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.569405 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.569429 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.569449 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.569467 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.569485 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.575974 1195787 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:37:35.578737 1195787 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:37:35.594362 1195787 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:37:35.598161 1195787 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:37:35.598363 1195787 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:37:35.598485 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:35.598543 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.635547 1195787 command_runner.go:130] > {
	I1218 00:37:35.635578 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.635584 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635591 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.635596 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635602 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.635605 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635609 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635623 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.635631 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.635634 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635639 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.635643 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635650 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635654 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635657 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635668 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.635672 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635677 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.635680 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635684 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635693 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.635701 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.635704 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635709 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.635712 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635719 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635723 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635731 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.635735 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635740 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.635743 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635747 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635758 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.635773 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.635777 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635781 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.635786 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.635790 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635793 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635795 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635802 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.635805 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635810 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.635815 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635823 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635830 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.635838 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.635841 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635845 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.635848 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635852 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635855 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635864 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635868 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635872 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635875 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635881 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.635885 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635890 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.635893 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635897 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635905 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.635912 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.635915 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635920 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.635926 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635930 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635934 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635938 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635941 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635944 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635947 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635954 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.635957 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635963 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.635966 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635970 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635978 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.635986 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.635989 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635993 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.635997 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636000 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636003 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636007 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636011 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636013 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636016 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636022 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.636027 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636032 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.636035 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636039 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636046 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.636054 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.636057 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636060 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.636064 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636073 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636077 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636079 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636086 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.636090 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636095 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.636098 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636102 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636110 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.636125 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.636133 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636137 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.636140 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636144 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636147 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636151 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636154 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636158 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636160 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636166 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.636170 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636175 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.636178 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636182 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636190 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.636197 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.636200 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636204 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.636208 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636211 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.636214 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636238 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636243 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.636251 1195787 command_runner.go:130] >     }
	I1218 00:37:35.636254 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.636256 1195787 command_runner.go:130] > }
	I1218 00:37:35.636431 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.636439 1195787 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:37:35.636495 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.658094 1195787 command_runner.go:130] > {
	I1218 00:37:35.658111 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.658115 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658124 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.658128 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658134 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.658137 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658141 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658151 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.658159 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.658163 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658167 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.658171 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658176 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658179 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658182 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658189 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.658192 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658198 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.658201 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658205 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658213 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.658222 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.658225 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658229 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.658233 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658242 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658250 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658262 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658269 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.658273 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658279 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.658282 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658286 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658294 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.658302 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.658305 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658309 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.658313 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.658317 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658321 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658323 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658330 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.658334 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658339 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.658344 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658348 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658356 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.658367 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.658370 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658374 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.658378 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658382 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658384 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658393 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658397 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658400 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658403 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658410 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.658413 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658425 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.658431 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658435 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658443 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.658455 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.658465 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658469 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.658472 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658476 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658479 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658483 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658487 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658490 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658493 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658499 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.658503 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658508 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.658511 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658515 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658523 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.658532 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.658535 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658539 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.658543 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658549 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658552 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658556 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658560 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658563 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658566 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658572 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.658577 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658582 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.658589 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658598 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658605 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.658613 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.658616 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658620 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.658624 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658628 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658631 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658642 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658650 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.658653 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658659 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.658662 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658666 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658677 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.658694 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.658697 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658701 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.658705 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658708 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658711 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658715 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658718 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658721 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658731 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.658734 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658739 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.658742 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658746 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658754 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.658761 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.658772 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658777 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.658781 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658784 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.658788 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658791 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658794 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.658798 1195787 command_runner.go:130] >     }
	I1218 00:37:35.658800 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.658803 1195787 command_runner.go:130] > }
	I1218 00:37:35.660205 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.660262 1195787 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:37:35.660279 1195787 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:37:35.660385 1195787 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:37:35.660470 1195787 ssh_runner.go:195] Run: crio config
	I1218 00:37:35.707278 1195787 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:37:35.707300 1195787 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:37:35.707307 1195787 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:37:35.707310 1195787 command_runner.go:130] > #
	I1218 00:37:35.707318 1195787 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:37:35.707324 1195787 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:37:35.707330 1195787 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:37:35.707346 1195787 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:37:35.707349 1195787 command_runner.go:130] > # reload'.
	I1218 00:37:35.707356 1195787 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:37:35.707362 1195787 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:37:35.707368 1195787 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:37:35.707383 1195787 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:37:35.707387 1195787 command_runner.go:130] > [crio]
	I1218 00:37:35.707393 1195787 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:37:35.707398 1195787 command_runner.go:130] > # containers images, in this directory.
	I1218 00:37:35.707595 1195787 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:37:35.707607 1195787 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:37:35.707620 1195787 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:37:35.707627 1195787 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:37:35.707631 1195787 command_runner.go:130] > # imagestore = ""
	I1218 00:37:35.707637 1195787 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:37:35.707643 1195787 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:37:35.707768 1195787 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:37:35.707777 1195787 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:37:35.707784 1195787 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:37:35.707788 1195787 command_runner.go:130] > # storage_option = [
	I1218 00:37:35.707935 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.707952 1195787 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:37:35.707959 1195787 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:37:35.707971 1195787 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:37:35.707978 1195787 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:37:35.707984 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:37:35.707990 1195787 command_runner.go:130] > # always happen on a node reboot
	I1218 00:37:35.708138 1195787 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:37:35.708160 1195787 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:37:35.708174 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:37:35.708183 1195787 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:37:35.708342 1195787 command_runner.go:130] > # version_file_persist = ""
	I1218 00:37:35.708354 1195787 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:37:35.708363 1195787 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:37:35.708367 1195787 command_runner.go:130] > # internal_wipe = true
	I1218 00:37:35.708381 1195787 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:37:35.708388 1195787 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:37:35.708503 1195787 command_runner.go:130] > # internal_repair = true
	I1218 00:37:35.708512 1195787 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:37:35.708519 1195787 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:37:35.708525 1195787 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:37:35.708671 1195787 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:37:35.708682 1195787 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:37:35.708686 1195787 command_runner.go:130] > [crio.api]
	I1218 00:37:35.708706 1195787 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:37:35.708833 1195787 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:37:35.708843 1195787 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:37:35.708997 1195787 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:37:35.709007 1195787 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:37:35.709013 1195787 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:37:35.709016 1195787 command_runner.go:130] > # stream_port = "0"
	I1218 00:37:35.709022 1195787 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:37:35.709150 1195787 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:37:35.709160 1195787 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:37:35.709282 1195787 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:37:35.709292 1195787 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:37:35.709298 1195787 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709420 1195787 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:37:35.709430 1195787 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:37:35.709436 1195787 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709440 1195787 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:37:35.709447 1195787 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:37:35.709453 1195787 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:37:35.709462 1195787 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:37:35.709593 1195787 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:37:35.709614 1195787 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709735 1195787 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:37:35.709746 1195787 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709864 1195787 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:37:35.709875 1195787 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:37:35.709881 1195787 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:37:35.709885 1195787 command_runner.go:130] > [crio.runtime]
	I1218 00:37:35.709891 1195787 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:37:35.709896 1195787 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:37:35.709907 1195787 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:37:35.709913 1195787 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:37:35.709917 1195787 command_runner.go:130] > # default_ulimits = [
	I1218 00:37:35.710017 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710026 1195787 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:37:35.710154 1195787 command_runner.go:130] > # no_pivot = false
	I1218 00:37:35.710163 1195787 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:37:35.710170 1195787 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:37:35.710300 1195787 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:37:35.710309 1195787 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:37:35.710323 1195787 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:37:35.710336 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710476 1195787 command_runner.go:130] > # conmon = ""
	I1218 00:37:35.710485 1195787 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:37:35.710492 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:37:35.710496 1195787 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:37:35.710508 1195787 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:37:35.710514 1195787 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:37:35.710521 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710524 1195787 command_runner.go:130] > # conmon_env = [
	I1218 00:37:35.710624 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710633 1195787 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:37:35.710639 1195787 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:37:35.710644 1195787 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:37:35.710648 1195787 command_runner.go:130] > # default_env = [
	I1218 00:37:35.710790 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710800 1195787 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:37:35.710816 1195787 command_runner.go:130] > # This option is deprecated, and be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:37:35.710953 1195787 command_runner.go:130] > # selinux = false
	I1218 00:37:35.710964 1195787 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:37:35.710972 1195787 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:37:35.710977 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.710981 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.710993 1195787 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:37:35.710999 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711131 1195787 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:37:35.711142 1195787 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:37:35.711149 1195787 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:37:35.711162 1195787 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:37:35.711169 1195787 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:37:35.711174 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711345 1195787 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:37:35.711373 1195787 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:37:35.711401 1195787 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:37:35.711419 1195787 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:37:35.711456 1195787 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:37:35.711477 1195787 command_runner.go:130] > # blockio parameters.
	I1218 00:37:35.711667 1195787 command_runner.go:130] > # blockio_reload = false
	I1218 00:37:35.711706 1195787 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:37:35.711725 1195787 command_runner.go:130] > # irqbalance daemon.
	I1218 00:37:35.711743 1195787 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:37:35.711776 1195787 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:37:35.711801 1195787 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:37:35.711821 1195787 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:37:35.711855 1195787 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:37:35.711879 1195787 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:37:35.711898 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.712052 1195787 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:37:35.712092 1195787 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:37:35.712112 1195787 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:37:35.712133 1195787 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:37:35.712151 1195787 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:37:35.712187 1195787 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:37:35.712206 1195787 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:37:35.712253 1195787 command_runner.go:130] > # will be added.
	I1218 00:37:35.712276 1195787 command_runner.go:130] > # default_capabilities = [
	I1218 00:37:35.712420 1195787 command_runner.go:130] > # 	"CHOWN",
	I1218 00:37:35.712461 1195787 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:37:35.712541 1195787 command_runner.go:130] > # 	"FSETID",
	I1218 00:37:35.712631 1195787 command_runner.go:130] > # 	"FOWNER",
	I1218 00:37:35.712660 1195787 command_runner.go:130] > # 	"SETGID",
	I1218 00:37:35.712794 1195787 command_runner.go:130] > # 	"SETUID",
	I1218 00:37:35.712896 1195787 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:37:35.712994 1195787 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:37:35.713065 1195787 command_runner.go:130] > # 	"KILL",
	I1218 00:37:35.713149 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.713172 1195787 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:37:35.713258 1195787 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:37:35.713410 1195787 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:37:35.713489 1195787 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:37:35.713545 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.713716 1195787 command_runner.go:130] > default_sysctls = [
	I1218 00:37:35.713734 1195787 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:37:35.713949 1195787 command_runner.go:130] > ]
	I1218 00:37:35.713959 1195787 command_runner.go:130] > # List of devices on the host that a
	I1218 00:37:35.713966 1195787 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:37:35.713970 1195787 command_runner.go:130] > # allowed_devices = [
	I1218 00:37:35.713995 1195787 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:37:35.714000 1195787 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:37:35.714003 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714008 1195787 command_runner.go:130] > # List of additional devices. specified as
	I1218 00:37:35.714016 1195787 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:37:35.714022 1195787 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:37:35.714028 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.714032 1195787 command_runner.go:130] > # additional_devices = [
	I1218 00:37:35.714035 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714040 1195787 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:37:35.714044 1195787 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:37:35.714048 1195787 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:37:35.714052 1195787 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:37:35.714056 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714062 1195787 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:37:35.714068 1195787 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:37:35.714077 1195787 command_runner.go:130] > # Defaults to false.
	I1218 00:37:35.714083 1195787 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:37:35.714089 1195787 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:37:35.714100 1195787 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:37:35.714537 1195787 command_runner.go:130] > # hooks_dir = [
	I1218 00:37:35.714675 1195787 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:37:35.714791 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.715258 1195787 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:37:35.715414 1195787 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:37:35.715601 1195787 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:37:35.715650 1195787 command_runner.go:130] > #
	I1218 00:37:35.715843 1195787 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:37:35.715943 1195787 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:37:35.716060 1195787 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:37:35.716083 1195787 command_runner.go:130] > #
	I1218 00:37:35.716111 1195787 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:37:35.716131 1195787 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:37:35.716166 1195787 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:37:35.716187 1195787 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:37:35.716204 1195787 command_runner.go:130] > #
	I1218 00:37:35.716248 1195787 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:37:35.716275 1195787 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:37:35.716306 1195787 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:37:35.717368 1195787 command_runner.go:130] > # pids_limit = -1
	I1218 00:37:35.717418 1195787 command_runner.go:130] > # Maximum sized allowed for the container log file. Negative numbers indicate
	I1218 00:37:35.717442 1195787 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:37:35.717463 1195787 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:37:35.717499 1195787 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:37:35.717521 1195787 command_runner.go:130] > # log_size_max = -1
	I1218 00:37:35.717693 1195787 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:37:35.717720 1195787 command_runner.go:130] > # log_to_journald = false
	I1218 00:37:35.717752 1195787 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:37:35.717776 1195787 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:37:35.717810 1195787 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:37:35.717835 1195787 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:37:35.717855 1195787 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:37:35.717888 1195787 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:37:35.717911 1195787 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:37:35.717929 1195787 command_runner.go:130] > # read_only = false
	I1218 00:37:35.717949 1195787 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:37:35.717978 1195787 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:37:35.717999 1195787 command_runner.go:130] > # live configuration reload.
	I1218 00:37:35.718017 1195787 command_runner.go:130] > # log_level = "info"
	I1218 00:37:35.718039 1195787 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:37:35.718073 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.718091 1195787 command_runner.go:130] > # log_filter = ""
	I1218 00:37:35.718112 1195787 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718144 1195787 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:37:35.718167 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718189 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718221 1195787 command_runner.go:130] > # uid_mappings = ""
	I1218 00:37:35.718243 1195787 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718262 1195787 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:37:35.718280 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718311 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718343 1195787 command_runner.go:130] > # gid_mappings = ""
	I1218 00:37:35.718363 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:37:35.718395 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718420 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718442 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718481 1195787 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:37:35.718507 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:37:35.718529 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718561 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718589 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718607 1195787 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:37:35.718641 1195787 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:37:35.718665 1195787 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:37:35.718685 1195787 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:37:35.718717 1195787 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:37:35.718741 1195787 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:37:35.718762 1195787 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:37:35.718793 1195787 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:37:35.718814 1195787 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:37:35.718831 1195787 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:37:35.718851 1195787 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:37:35.718882 1195787 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:37:35.718907 1195787 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:37:35.718931 1195787 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:37:35.718965 1195787 command_runner.go:130] > # shared_cpuset  determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:37:35.718989 1195787 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:37:35.719009 1195787 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:37:35.719039 1195787 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:37:35.719315 1195787 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:37:35.719348 1195787 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:37:35.719365 1195787 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:37:35.719396 1195787 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:37:35.719423 1195787 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:37:35.719450 1195787 command_runner.go:130] > # pinns_path = ""
	I1218 00:37:35.719484 1195787 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:37:35.719510 1195787 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:37:35.719528 1195787 command_runner.go:130] > # enable_criu_support = true
	I1218 00:37:35.719563 1195787 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:37:35.719586 1195787 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:37:35.719602 1195787 command_runner.go:130] > # enable_pod_events = false
	I1218 00:37:35.719622 1195787 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:37:35.719651 1195787 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:37:35.719672 1195787 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:37:35.719690 1195787 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:37:35.719711 1195787 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:37:35.719747 1195787 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:37:35.719770 1195787 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:37:35.719795 1195787 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:37:35.719826 1195787 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:37:35.719849 1195787 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:37:35.719865 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.719885 1195787 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:37:35.719916 1195787 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:37:35.719938 1195787 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:37:35.719957 1195787 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:37:35.719973 1195787 command_runner.go:130] > #
	I1218 00:37:35.720002 1195787 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:37:35.720024 1195787 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:37:35.720041 1195787 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:37:35.720059 1195787 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:37:35.720096 1195787 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:37:35.720121 1195787 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:37:35.720139 1195787 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:37:35.720170 1195787 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:37:35.720190 1195787 command_runner.go:130] > # monitor_env = []
	I1218 00:37:35.720207 1195787 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:37:35.720256 1195787 command_runner.go:130] > # allowed_annotations = []
	I1218 00:37:35.720274 1195787 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:37:35.720279 1195787 command_runner.go:130] > # no_sync_log = false
	I1218 00:37:35.720284 1195787 command_runner.go:130] > # default_annotations = {}
	I1218 00:37:35.720288 1195787 command_runner.go:130] > # stream_websockets = false
	I1218 00:37:35.720292 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.720348 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.720360 1195787 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:37:35.720367 1195787 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:37:35.720386 1195787 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:37:35.720399 1195787 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:37:35.720403 1195787 command_runner.go:130] > #   in $PATH.
	I1218 00:37:35.720418 1195787 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:37:35.720433 1195787 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:37:35.720439 1195787 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:37:35.720444 1195787 command_runner.go:130] > #   state.
	I1218 00:37:35.720451 1195787 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:37:35.720460 1195787 command_runner.go:130] > #   file. This can only be used with when using the VM runtime_type.
	I1218 00:37:35.720466 1195787 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:37:35.720473 1195787 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:37:35.720480 1195787 command_runner.go:130] > #   the values from the default runtime on load time.
	I1218 00:37:35.720496 1195787 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:37:35.720506 1195787 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:37:35.720513 1195787 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:37:35.720531 1195787 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:37:35.720543 1195787 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:37:35.720550 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:37:35.720566 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:37:35.720576 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:37:35.720582 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:37:35.720590 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:37:35.720628 1195787 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:37:35.720649 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:37:35.720665 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for container init process.
	I1218 00:37:35.720671 1195787 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:37:35.720679 1195787 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:37:35.720689 1195787 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:37:35.720706 1195787 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:37:35.720719 1195787 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:37:35.720733 1195787 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:37:35.720746 1195787 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:37:35.720754 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:37:35.720760 1195787 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:37:35.720764 1195787 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:37:35.720772 1195787 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:37:35.720777 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:37:35.720783 1195787 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:37:35.720795 1195787 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:37:35.720813 1195787 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:37:35.720825 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:37:35.720833 1195787 command_runner.go:130] > #   When using the pod runtime and conmon-rs, then the monitor_env can be used to further configure
	I1218 00:37:35.720849 1195787 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:37:35.720857 1195787 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:37:35.720865 1195787 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:37:35.720875 1195787 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:37:35.720882 1195787 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:37:35.720888 1195787 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:37:35.720897 1195787 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:37:35.720905 1195787 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:37:35.720932 1195787 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:37:35.720946 1195787 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:37:35.720960 1195787 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:37:35.720968 1195787 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:37:35.720975 1195787 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:37:35.720983 1195787 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:37:35.720995 1195787 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:37:35.721013 1195787 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:37:35.721023 1195787 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:37:35.721031 1195787 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:37:35.721033 1195787 command_runner.go:130] > #
	I1218 00:37:35.721038 1195787 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:37:35.721043 1195787 command_runner.go:130] > #
	I1218 00:37:35.721049 1195787 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:37:35.721058 1195787 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:37:35.721061 1195787 command_runner.go:130] > #
	I1218 00:37:35.721072 1195787 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:37:35.721082 1195787 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:37:35.721085 1195787 command_runner.go:130] > #
	I1218 00:37:35.721091 1195787 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:37:35.721097 1195787 command_runner.go:130] > # feature.
	I1218 00:37:35.721100 1195787 command_runner.go:130] > #
	I1218 00:37:35.721106 1195787 command_runner.go:130] > # If everything is setup, CRI-O will modify chosen seccomp profiles for
	I1218 00:37:35.721112 1195787 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:37:35.721119 1195787 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:37:35.721125 1195787 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:37:35.721131 1195787 command_runner.go:130] > # seconds if the value of "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:37:35.721141 1195787 command_runner.go:130] > #
	I1218 00:37:35.721147 1195787 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:37:35.721153 1195787 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:37:35.721158 1195787 command_runner.go:130] > #
	I1218 00:37:35.721164 1195787 command_runner.go:130] > # This also means that the Pods "restartPolicy" has to be set to "Never",
	I1218 00:37:35.721170 1195787 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:37:35.721177 1195787 command_runner.go:130] > #
	I1218 00:37:35.721183 1195787 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:37:35.721188 1195787 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:37:35.721192 1195787 command_runner.go:130] > # limitation.
	I1218 00:37:35.721196 1195787 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:37:35.721200 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:37:35.721204 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721215 1195787 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:37:35.721228 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721232 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721236 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721241 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721248 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721251 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721255 1195787 command_runner.go:130] > allowed_annotations = [
	I1218 00:37:35.721261 1195787 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:37:35.721265 1195787 command_runner.go:130] > ]
	I1218 00:37:35.721270 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721274 1195787 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:37:35.721279 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:37:35.721282 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721289 1195787 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:37:35.721293 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721307 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721312 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721316 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721320 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721325 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721331 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721339 1195787 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:37:35.721347 1195787 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:37:35.721353 1195787 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:37:35.721361 1195787 command_runner.go:130] > # Each workload, has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:37:35.721384 1195787 command_runner.go:130] > # The currently supported resources are "cpuperiod" "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:37:35.721399 1195787 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores, this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:37:35.721406 1195787 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:37:35.721417 1195787 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:37:35.721427 1195787 command_runner.go:130] > # For a container to opt-into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:37:35.721438 1195787 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:37:35.721444 1195787 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:37:35.721457 1195787 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:37:35.721461 1195787 command_runner.go:130] > # Example:
	I1218 00:37:35.721466 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:37:35.721472 1195787 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:37:35.721477 1195787 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:37:35.721487 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:37:35.721490 1195787 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:37:35.721494 1195787 command_runner.go:130] > # cpushares = "5"
	I1218 00:37:35.721498 1195787 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:37:35.721502 1195787 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:37:35.721507 1195787 command_runner.go:130] > # cpulimit = "35"
	I1218 00:37:35.721510 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.721516 1195787 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:37:35.721524 1195787 command_runner.go:130] > # To specify, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:37:35.721529 1195787 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:37:35.721535 1195787 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:37:35.721544 1195787 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:37:35.721552 1195787 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:37:35.721556 1195787 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:37:35.721563 1195787 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:37:35.721568 1195787 command_runner.go:130] > # Default value is set to true
	I1218 00:37:35.721574 1195787 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:37:35.721580 1195787 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:37:35.721588 1195787 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:37:35.721592 1195787 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:37:35.721621 1195787 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:37:35.721627 1195787 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:37:35.721635 1195787 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:37:35.721640 1195787 command_runner.go:130] > # timezone = ""
	I1218 00:37:35.721647 1195787 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:37:35.721650 1195787 command_runner.go:130] > #
	I1218 00:37:35.721656 1195787 command_runner.go:130] > # CRI-O reads its configured registries defaults from the system wide
	I1218 00:37:35.721665 1195787 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:37:35.721672 1195787 command_runner.go:130] > [crio.image]
	I1218 00:37:35.721679 1195787 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:37:35.721683 1195787 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:37:35.721689 1195787 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:37:35.721701 1195787 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721706 1195787 command_runner.go:130] > # global_auth_file = ""
	I1218 00:37:35.721711 1195787 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:37:35.721723 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721728 1195787 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.721738 1195787 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:37:35.721745 1195787 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721754 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721758 1195787 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:37:35.721764 1195787 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:37:35.721769 1195787 command_runner.go:130] > # When explicitly set to "", it will fallback to the entrypoint and command
	I1218 00:37:35.721776 1195787 command_runner.go:130] > # specified in the pause image. When commented out, it will fallback to the
	I1218 00:37:35.721781 1195787 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:37:35.721787 1195787 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:37:35.721793 1195787 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:37:35.721799 1195787 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:37:35.721805 1195787 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:37:35.721813 1195787 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:37:35.721819 1195787 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:37:35.721825 1195787 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:37:35.721831 1195787 command_runner.go:130] > # pinned_images = [
	I1218 00:37:35.721834 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721840 1195787 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:37:35.721846 1195787 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:37:35.721853 1195787 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:37:35.721859 1195787 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:37:35.721866 1195787 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:37:35.721871 1195787 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:37:35.721879 1195787 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:37:35.721892 1195787 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:37:35.721901 1195787 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:37:35.721912 1195787 command_runner.go:130] > # or the concatenated path is non existent, then the signature_policy or system
	I1218 00:37:35.721918 1195787 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:37:35.721923 1195787 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:37:35.721928 1195787 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:37:35.721935 1195787 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:37:35.721938 1195787 command_runner.go:130] > # changing them here.
	I1218 00:37:35.721944 1195787 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:37:35.721955 1195787 command_runner.go:130] > # insecure_registries = [
	I1218 00:37:35.721957 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721964 1195787 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:37:35.721969 1195787 command_runner.go:130] > # ignore; the latter will ignore volumes entirely.
	I1218 00:37:35.721977 1195787 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:37:35.721983 1195787 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:37:35.721987 1195787 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:37:35.721998 1195787 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:37:35.722005 1195787 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:37:35.722009 1195787 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:37:35.722015 1195787 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:37:35.722024 1195787 command_runner.go:130] > # gets canceled. This value will be also used for calculating the pull progress interval to pull_progress_timeout / 10.
	I1218 00:37:35.722031 1195787 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:37:35.722036 1195787 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:37:35.722048 1195787 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:37:35.722054 1195787 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:37:35.722062 1195787 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used, but the results are ambiguous.
	I1218 00:37:35.722070 1195787 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:37:35.722074 1195787 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:37:35.722081 1195787 command_runner.go:130] > # OCIArtifactMountSupport is whether CRI-O should support OCI artifacts.
	I1218 00:37:35.722087 1195787 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:37:35.722091 1195787 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:37:35.722097 1195787 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:37:35.722108 1195787 command_runner.go:130] > # CNI plugins.
	I1218 00:37:35.722117 1195787 command_runner.go:130] > [crio.network]
	I1218 00:37:35.722131 1195787 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:37:35.722136 1195787 command_runner.go:130] > # CRI-O will pick-up the first one found in network_dir.
	I1218 00:37:35.722142 1195787 command_runner.go:130] > # cni_default_network = ""
	I1218 00:37:35.722148 1195787 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:37:35.722156 1195787 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:37:35.722162 1195787 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:37:35.722165 1195787 command_runner.go:130] > # plugin_dirs = [
	I1218 00:37:35.722169 1195787 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:37:35.722172 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722176 1195787 command_runner.go:130] > # List of included pod metrics.
	I1218 00:37:35.722180 1195787 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:37:35.722182 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722190 1195787 command_runner.go:130] > # A necessary configuration for Prometheus based metrics retrieval
	I1218 00:37:35.722196 1195787 command_runner.go:130] > [crio.metrics]
	I1218 00:37:35.722201 1195787 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:37:35.722205 1195787 command_runner.go:130] > # enable_metrics = false
	I1218 00:37:35.722209 1195787 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:37:35.722215 1195787 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:37:35.722222 1195787 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:37:35.722233 1195787 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:37:35.722239 1195787 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:37:35.722243 1195787 command_runner.go:130] > # metrics_collectors = [
	I1218 00:37:35.722247 1195787 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:37:35.722252 1195787 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:37:35.722256 1195787 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:37:35.722260 1195787 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:37:35.722266 1195787 command_runner.go:130] > # 	"operations_total",
	I1218 00:37:35.722270 1195787 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:37:35.722275 1195787 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:37:35.722279 1195787 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:37:35.722283 1195787 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:37:35.722287 1195787 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:37:35.722295 1195787 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:37:35.722299 1195787 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:37:35.722312 1195787 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:37:35.722316 1195787 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:37:35.722321 1195787 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:37:35.722325 1195787 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:37:35.722329 1195787 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:37:35.722332 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722338 1195787 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:37:35.722342 1195787 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:37:35.722347 1195787 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:37:35.722351 1195787 command_runner.go:130] > # metrics_port = 9090
	I1218 00:37:35.722358 1195787 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:37:35.722362 1195787 command_runner.go:130] > # metrics_socket = ""
	I1218 00:37:35.722377 1195787 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:37:35.722386 1195787 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:37:35.722398 1195787 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:37:35.722403 1195787 command_runner.go:130] > # certificate on any modification event.
	I1218 00:37:35.722406 1195787 command_runner.go:130] > # metrics_cert = ""
	I1218 00:37:35.722411 1195787 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:37:35.722421 1195787 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:37:35.722424 1195787 command_runner.go:130] > # metrics_key = ""
	I1218 00:37:35.722433 1195787 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:37:35.722437 1195787 command_runner.go:130] > [crio.tracing]
	I1218 00:37:35.722445 1195787 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:37:35.722451 1195787 command_runner.go:130] > # enable_tracing = false
	I1218 00:37:35.722464 1195787 command_runner.go:130] > # Address on which the gRPC trace collector listens on.
	I1218 00:37:35.722472 1195787 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:37:35.722479 1195787 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:37:35.722485 1195787 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:37:35.722490 1195787 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:37:35.722493 1195787 command_runner.go:130] > [crio.nri]
	I1218 00:37:35.722498 1195787 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:37:35.722507 1195787 command_runner.go:130] > # enable_nri = true
	I1218 00:37:35.722519 1195787 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:37:35.722524 1195787 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:37:35.722528 1195787 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:37:35.722539 1195787 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:37:35.722544 1195787 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:37:35.722549 1195787 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:37:35.722557 1195787 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:37:35.722613 1195787 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:37:35.722623 1195787 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:37:35.722628 1195787 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:37:35.722634 1195787 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:37:35.722640 1195787 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:37:35.722645 1195787 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:37:35.722651 1195787 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:37:35.722658 1195787 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:37:35.722663 1195787 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:37:35.722666 1195787 command_runner.go:130] > # - OCI hook injection
	I1218 00:37:35.722671 1195787 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:37:35.722677 1195787 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:37:35.722683 1195787 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:37:35.722689 1195787 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:37:35.722696 1195787 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:37:35.722702 1195787 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:37:35.722709 1195787 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:37:35.722712 1195787 command_runner.go:130] > #
	I1218 00:37:35.722717 1195787 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:37:35.722724 1195787 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:37:35.722729 1195787 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:37:35.722734 1195787 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:37:35.722739 1195787 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:37:35.722744 1195787 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:37:35.722749 1195787 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:37:35.722759 1195787 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:37:35.722765 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722771 1195787 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:37:35.722777 1195787 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:37:35.722788 1195787 command_runner.go:130] > [crio.stats]
	I1218 00:37:35.722797 1195787 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:37:35.722805 1195787 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:37:35.722809 1195787 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:37:35.722814 1195787 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:37:35.722821 1195787 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:37:35.722825 1195787 command_runner.go:130] > # collection_period = 0
	I1218 00:37:35.722870 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686277403Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:37:35.722885 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686455769Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:37:35.722906 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686635242Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:37:35.722915 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686725939Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:37:35.722930 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686860827Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.722940 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.687143526Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:37:35.722954 1195787 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
	I1218 00:37:35.723070 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:35.723084 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:35.723105 1195787 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:37:35.723135 1195787 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath
:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:37:35.723264 1195787 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:37:35.723342 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:37:35.730799 1195787 command_runner.go:130] > kubeadm
	I1218 00:37:35.730815 1195787 command_runner.go:130] > kubectl
	I1218 00:37:35.730820 1195787 command_runner.go:130] > kubelet
	I1218 00:37:35.730852 1195787 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:37:35.730903 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:37:35.737892 1195787 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:37:35.749699 1195787 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:37:35.761635 1195787 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:37:35.773650 1195787 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:37:35.777155 1195787 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1218 00:37:35.777265 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.913809 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:36.641224 1195787 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:37:36.641246 1195787 certs.go:195] generating shared ca certs ...
	I1218 00:37:36.641263 1195787 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:36.641410 1195787 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:37:36.641464 1195787 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:37:36.641475 1195787 certs.go:257] generating profile certs ...
	I1218 00:37:36.641577 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:37:36.641667 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:37:36.641711 1195787 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:37:36.641724 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:37:36.641737 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:37:36.641753 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:37:36.641763 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:37:36.641780 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:37:36.641792 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:37:36.641807 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:37:36.641818 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:37:36.641873 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:37:36.641907 1195787 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:37:36.641920 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:37:36.641952 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:37:36.641982 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:37:36.642014 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:37:36.642068 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:36.642106 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:37:36.642122 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.642133 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.642704 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:37:36.662928 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:37:36.685489 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:37:36.708038 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:37:36.726679 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:37:36.744109 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:37:36.760724 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:37:36.777802 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:37:36.794736 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:37:36.811089 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:37:36.827838 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:37:36.844718 1195787 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:37:36.856626 1195787 ssh_runner.go:195] Run: openssl version
	I1218 00:37:36.862122 1195787 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:37:36.862595 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.869813 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:37:36.876968 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880287 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880319 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880364 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.920445 1195787 command_runner.go:130] > b5213941
	I1218 00:37:36.920887 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:37:36.928015 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.934857 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:37:36.941992 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945456 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945522 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945583 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.985712 1195787 command_runner.go:130] > 51391683
	I1218 00:37:36.986191 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:37:36.993294 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.001803 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:37:37.011590 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.016819 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017267 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017348 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.061113 1195787 command_runner.go:130] > 3ec20f2e
	I1218 00:37:37.061606 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
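	[editor's note] The hash/symlink sequence above can be reproduced standalone. This is a minimal sketch of the CA-trust pattern the log shows: `openssl x509 -hash` computes a subject hash (e.g. `b5213941`), which names the `/etc/ssl/certs/<hash>.0` symlink that minikube installs. The throwaway self-signed certificate below is only so the example is self-contained; substitute a real CA file, and note the final `ln -fs` is only echoed here, not run.

	```shell
	# Generate a throwaway self-signed cert to stand in for minikubeCA.pem.
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" -days 1 \
	  -keyout demo.key -out demo.pem 2>/dev/null
	# The subject hash is what OpenSSL uses to look the CA up in /etc/ssl/certs.
	hash=$(openssl x509 -hash -noout -in demo.pem)
	# On a real host this would be: sudo ln -fs <pem> /etc/ssl/certs/${hash}.0
	echo "symlink name: ${hash}.0"
	```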
	I1218 00:37:37.068668 1195787 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072025 1195787 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072050 1195787 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:37:37.072057 1195787 command_runner.go:130] > Device: 259,1	Inode: 1326178     Links: 1
	I1218 00:37:37.072063 1195787 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:37.072070 1195787 command_runner.go:130] > Access: 2025-12-18 00:33:28.828061434 +0000
	I1218 00:37:37.072075 1195787 command_runner.go:130] > Modify: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072080 1195787 command_runner.go:130] > Change: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072086 1195787 command_runner.go:130] >  Birth: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072155 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:37:37.111978 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.112489 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:37:37.152999 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.153074 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:37:37.194884 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.195292 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:37:37.235218 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.235658 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:37:37.275710 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.276177 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:37:37.316082 1195787 command_runner.go:130] > Certificate will not expire
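	[editor's note] The six expiry probes above all use the same primitive; here is a self-contained sketch. `openssl x509 -checkend N` exits 0 and prints "Certificate will not expire" when the certificate is still valid N seconds from now; the log uses N=86400 (one day). The self-signed cert below is a stand-in for the real control-plane certificates.

	```shell
	# Issue a cert valid for 2 days, then ask whether it survives the next day.
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" -days 2 \
	  -keyout probe.key -out probe.crt 2>/dev/null
	openssl x509 -noout -in probe.crt -checkend 86400
	```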
	I1218 00:37:37.316486 1195787 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:37.316593 1195787 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:37:37.316685 1195787 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:37:37.341722 1195787 cri.go:89] found id: ""
	I1218 00:37:37.341828 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:37:37.348335 1195787 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:37:37.348357 1195787 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:37:37.348372 1195787 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:37:37.349183 1195787 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:37:37.349197 1195787 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:37:37.349253 1195787 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:37:37.356307 1195787 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:37:37.356734 1195787 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.356836 1195787 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "functional-288604" cluster setting kubeconfig missing "functional-288604" context setting]
	I1218 00:37:37.357097 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.357514 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.357675 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.358178 1195787 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:37:37.358185 1195787 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:37:37.358343 1195787 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:37:37.358365 1195787 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:37:37.358389 1195787 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:37:37.358400 1195787 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:37:37.358747 1195787 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:37:37.366250 1195787 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:37:37.366287 1195787 kubeadm.go:602] duration metric: took 17.084351ms to restartPrimaryControlPlane
	I1218 00:37:37.366297 1195787 kubeadm.go:403] duration metric: took 49.819997ms to StartCluster
	I1218 00:37:37.366310 1195787 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.366369 1195787 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.366947 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.367145 1195787 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:37:37.367532 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:37.367580 1195787 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:37:37.367705 1195787 addons.go:70] Setting storage-provisioner=true in profile "functional-288604"
	I1218 00:37:37.367724 1195787 addons.go:239] Setting addon storage-provisioner=true in "functional-288604"
	I1218 00:37:37.367744 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.368436 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.368583 1195787 addons.go:70] Setting default-storageclass=true in profile "functional-288604"
	I1218 00:37:37.368601 1195787 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-288604"
	I1218 00:37:37.368944 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.373199 1195787 out.go:179] * Verifying Kubernetes components...
	I1218 00:37:37.376080 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:37.397822 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.397983 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.398246 1195787 addons.go:239] Setting addon default-storageclass=true in "functional-288604"
	I1218 00:37:37.398278 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.398894 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.407451 1195787 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:37:37.410300 1195787 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.410322 1195787 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:37:37.410384 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.434096 1195787 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:37.434117 1195787 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:37:37.434174 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.457842 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.477819 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.583963 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:37.618382 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.637024 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.392142 1195787 node_ready.go:35] waiting up to 6m0s for node "functional-288604" to be "Ready" ...
	I1218 00:37:38.392289 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.392356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.392602 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392638 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392662 1195787 retry.go:31] will retry after 293.380468ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392710 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392727 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392733 1195787 retry.go:31] will retry after 283.333163ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:38.676355 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.686660 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:38.750557 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753745 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753775 1195787 retry.go:31] will retry after 508.906429ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753840 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753899 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753916 1195787 retry.go:31] will retry after 283.918132ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.893115 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.893199 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.893535 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.038817 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.092066 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.095485 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.095518 1195787 retry.go:31] will retry after 317.14343ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.262906 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.318327 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.322166 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.322196 1195787 retry.go:31] will retry after 611.398612ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.392378 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.413200 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.474250 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.474288 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.474326 1195787 retry.go:31] will retry after 551.991324ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.892368 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.892757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
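	[editor's note] The repeated GETs of `/api/v1/nodes/functional-288604` above are a readiness poll: minikube waits for the node's `Ready` condition to become `True`. A rough offline sketch, using a hypothetical trimmed node-status JSON in place of a live apiserver response:

	```shell
	# Hypothetical, abbreviated node status document (no cluster required).
	cat > node.json <<'EOF'
	{"status":{"conditions":[{"type":"MemoryPressure","status":"False"},{"type":"Ready","status":"True"}]}}
	EOF
	# Extract the Ready condition the way a readiness wait would test it.
	ready=$(grep -o '"type":"Ready","status":"[A-Za-z]*"' node.json | grep -o 'True\|False')
	echo "node Ready=$ready"
	```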
	I1218 00:37:39.933930 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.991113 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.991153 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.991172 1195787 retry.go:31] will retry after 590.272449ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.027415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:40.085906 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.089482 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.089515 1195787 retry.go:31] will retry after 1.798316027s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.392931 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.393007 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.393310 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:40.393376 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:40.582668 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:40.643859 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.643900 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.643941 1195787 retry.go:31] will retry after 1.196819353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.892768 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.392495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.841577 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:41.888099 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:41.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.901267 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.901306 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.901323 1195787 retry.go:31] will retry after 1.106575841s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948402 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.948447 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948500 1195787 retry.go:31] will retry after 1.314106681s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:42.393054 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.393195 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.393477 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:42.393524 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:42.893249 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.893594 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.008894 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:43.066157 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.066194 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.066212 1195787 retry.go:31] will retry after 2.952953914s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.263490 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:43.325047 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.325147 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.325201 1195787 retry.go:31] will retry after 2.165088511s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.892529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.892853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.392698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.892859 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:44.892927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:45.392615 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.393055 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:45.491313 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:45.548834 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:45.552259 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.552290 1195787 retry.go:31] will retry after 4.009218302s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.020180 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:46.081331 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:46.081373 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.081392 1195787 retry.go:31] will retry after 2.724964309s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.392810 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.392886 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.393216 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.893121 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.893451 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:46.893527 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:47.392312 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.392379 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.392690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:47.892435 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.892508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.806570 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:48.859925 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:48.863450 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.863523 1195787 retry.go:31] will retry after 5.125713123s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.892640 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.892710 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.892972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:49.392419 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.392858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:49.392930 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:49.562244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:49.616549 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:49.619912 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.619976 1195787 retry.go:31] will retry after 7.525324152s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.893380 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.893476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.893792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.392521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.892413 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.392501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:51.892896 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:52.392394 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:52.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.892886 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.892492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.990244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:54.052144 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:54.052189 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.052212 1195787 retry.go:31] will retry after 10.028215297s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:54.392879 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:54.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.892892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.892535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:56.892936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:57.146448 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:57.223873 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:57.223911 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.223929 1195787 retry.go:31] will retry after 7.68443688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.392441 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.392757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:57.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.892896 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.892496 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.892902 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:58.892976 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:59.392373 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.392729 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:59.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.392605 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.392823 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.393374 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.893180 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.893259 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.893590 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:00.893634 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:01.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.392775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:01.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.892560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.392629 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.393091 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.893045 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.893120 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.893431 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:03.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.393360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.393682 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:03.393752 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:03.892452 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.892588 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.081415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:04.149098 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.149154 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.149173 1195787 retry.go:31] will retry after 12.181474759s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.392826 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.892706 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.908952 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:04.983582 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.983679 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.983707 1195787 retry.go:31] will retry after 20.674508131s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:05.393152 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.393222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.393548 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:05.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:05.892840 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:06.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:06.892466 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.892581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.392796 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.392870 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.393185 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.893008 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.893099 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.893411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:07.893460 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:08.393200 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.393269 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.393580 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:08.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.392350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:10.392470 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.392542 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:10.392885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:10.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.392567 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.392748 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.392724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.892526 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.892600 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.892927 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:12.892994 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:13.392672 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.393083 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:13.892350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.892423 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:15.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.392797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:15.392868 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:15.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.331590 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:16.385966 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:16.389831 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.389871 1195787 retry.go:31] will retry after 10.81475415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.393176 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.393493 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.893314 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.893409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.893794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:17.392528 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.392670 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:17.393070 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:17.892892 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.892977 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.893319 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.393167 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.393296 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:19.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.392531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.393011 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:19.393093 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:19.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.892493 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.392457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.392540 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.892404 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.892505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.892840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.392336 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.392752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:21.892899 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:22.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.392824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:22.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.892650 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.892812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:24.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.392459 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.392739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:24.392822 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:24.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.392505 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.392578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.392919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.658449 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:25.718689 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:25.718787 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.718811 1195787 retry.go:31] will retry after 20.411460434s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.893032 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.893152 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.893496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:26.393268 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.393345 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.393658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:26.393735 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:26.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.205308 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:27.264744 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:27.264795 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.264821 1195787 retry.go:31] will retry after 26.872581906s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.393247 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.393343 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.393691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.392400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.392499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.892880 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:28.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:29.392435 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.392530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:29.892518 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.892615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.892959 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.392483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.892684 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:30.893088 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:31.392443 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.392538 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:31.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.392398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.892495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:33.392363 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.392786 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:33.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:33.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.892942 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:35.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:35.392919 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:35.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.892552 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.892909 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:37.392676 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.392747 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.393087 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:37.393163 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:37.892815 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.892918 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.893191 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.392972 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.393069 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.393395 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.893206 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.893283 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.893614 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.892914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:39.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:40.392478 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:40.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.392429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.892557 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.892907 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:42.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.392468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:42.392895 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.892798 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.893169 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.392965 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.393041 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.393425 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.893065 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.893192 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.893542 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:44.393319 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.393457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.393797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:44.393864 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:44.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.892887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.392407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.392691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.892831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.131350 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:46.207148 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:46.207192 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.207211 1195787 retry.go:31] will retry after 46.493082425s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.392632 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.393042 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.892356 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.892439 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:46.892784 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:47.392561 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.392655 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.393022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:47.892957 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.893052 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.393028 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.393151 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.393502 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.893231 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.893339 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:48.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:49.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.892410 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.892719 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.392567 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.392922 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.892639 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.892718 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.893017 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:51.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:51.392927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:51.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.392587 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.392689 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.393078 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.892911 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.893278 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:53.393104 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.393175 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.393506 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:53.393578 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:53.893174 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.893253 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.893635 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.138097 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:54.199604 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:54.199639 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.199657 1195787 retry.go:31] will retry after 32.999586692s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.392915 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.392997 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.393320 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.893151 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.893222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.893558 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:55.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.393372 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.393696 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:55.393771 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:55.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.392482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.392536 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:57.892898 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:58.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.392510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.392842 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.892427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.892690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.392771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:00.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.392852 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:00.392906 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:00.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.392554 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.392642 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.392932 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:02.392460 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:02.392937 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:02.893082 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.893161 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.893517 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.393213 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.393335 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.393710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.893298 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.893393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.893695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.392345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.392774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.892322 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.892394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:04.892799 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:05.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:05.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.892687 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.893007 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.392320 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.392389 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.392734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:06.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:07.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.392661 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.393015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:07.892841 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.892966 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.893308 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.393076 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.393143 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.393465 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.893245 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.893642 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:08.893706 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:09.392340 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:09.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.392603 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.393041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.892396 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.892674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:11.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.392788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:11.392854 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:11.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.892864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.392694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:13.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.392558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:13.392904 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:13.892358 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.392462 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.892493 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.892568 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.892916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.392728 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:15.892902 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:16.392592 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.393051 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:16.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.892766 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.392504 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.392612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.392914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.892926 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.893001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:17.893380 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:18.393186 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.393287 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.393589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:18.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.892436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.892724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.892501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:20.392577 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.392677 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:20.393034 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:20.892700 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.892780 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.893096 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.392884 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.392972 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.393246 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.893036 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.893115 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.893439 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:22.393105 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.393180 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.393531 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:22.393602 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:22.893283 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.893357 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.893654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.392360 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.392759 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.892410 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.892763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.392324 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.392671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.892367 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:24.892851 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:25.392405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.392832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:25.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.892598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:26.893059 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:27.199423 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:39:27.258871 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262674 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262814 1195787 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:27.393144 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.393338 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.393739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:27.892445 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.892515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.392686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.393001 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.892388 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:29.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:29.392759 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:29.892449 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.892575 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.892898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.892522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:31.392593 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.392698 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.393057 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:31.393175 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.892746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.393076 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.700811 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:39:32.759422 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759519 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759610 1195787 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:32.762569 1195787 out.go:179] * Enabled addons: 
	I1218 00:39:32.766452 1195787 addons.go:530] duration metric: took 1m55.398865574s for enable addons: enabled=[]
	I1218 00:39:32.892720 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.892834 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.893134 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:33.392885 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.392951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.393266 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:33.393360 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:33.893120 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.893193 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.893559 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.393379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.393480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.393839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.892868 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.392614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.892537 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:35.892942 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:36.392383 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.392802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:36.892719 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.892802 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.893187 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.393130 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.393210 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.393536 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.892374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:38.392658 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.392729 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:38.393158 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:38.892929 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.893353 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.393129 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.393197 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.393452 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.893242 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.893317 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.893673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.392517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.892369 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.892525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:40.892938 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:41.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.392798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:41.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.892875 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.392446 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.892629 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.892701 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.893027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:42.893091 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:43.392773 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.392853 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.393188 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:43.892986 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.893071 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.893334 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.393187 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.393481 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.893282 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.893350 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.893683 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:44.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:45.392327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.392700 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:45.892412 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.392830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.892694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:47.392541 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:47.392943 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:47.892913 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.892990 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.392935 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.393001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.393289 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.893086 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.893157 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.893470 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:49.393330 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.393420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.393740 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:49.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:49.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.892391 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.892665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.392473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.892447 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.392762 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.892406 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:51.892823 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:52.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:52.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.392422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.392736 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.892489 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.892612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:53.893004 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:54.392376 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:54.892366 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.892778 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:56.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:56.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:56.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.392473 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.892694 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.892772 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.892739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:58.892789 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:59.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.392532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:59.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.892633 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.392899 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.892419 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:00.892885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:01.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:01.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.892996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.392436 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.392515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.892688 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.892766 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:02.893136 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:03.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:03.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.392791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.892671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:05.392381 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:05.392855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:05.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:07.392557 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.392630 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.392948 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:07.393044 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:07.892947 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.893292 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.393116 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.393189 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.393503 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.893257 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.893330 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.893657 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:09.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.393365 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.393623 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:09.393667 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:09.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.892429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.392530 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.892422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.892749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.892544 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.892620 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:11.893010 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:12.392339 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.392715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:12.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.892814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:14.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.392813 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:14.392867 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:14.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.392448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.892562 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.892645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.892993 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:16.392714 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.392789 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.393108 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:16.393181 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:16.892981 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.893057 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.893318 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.393280 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.393362 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.393715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.392754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.892443 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.892891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:18.892951 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:19.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.392715 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.393048 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:19.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.392464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.392845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:21.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:21.392721 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:21.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.392527 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:23.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:23.392882 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:23.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.392684 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:25.392518 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.392622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.393004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:25.393058 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:25.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.892660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.392355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.892589 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.392982 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.892845 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.892928 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.893247 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:27.893296 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:28.392780 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.392854 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.393198 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:28.892980 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.893051 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.893305 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.393025 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.393097 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.393406 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.893167 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.893246 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.893569 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:29.893625 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:30.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.392419 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.392749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:30.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.892844 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.392817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.892407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:32.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.392882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:32.392936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:32.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.892506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.392470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.892473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:34.892850 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:35.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.392811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:35.892523 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.892622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.892978 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.392672 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.892754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.392604 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:37.393019 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:37.892964 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.893061 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.893356 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.393199 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.393280 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.393633 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.892784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.392710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.892802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:39.892855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:40.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:40.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.892649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.892534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:42.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.392664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:42.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.892597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.392651 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.392720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.393047 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.892468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.892851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:44.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.392461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:44.392853 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:44.892484 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.892559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.892919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.392680 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.892767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.892679 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:46.892729 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:47.392534 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.392616 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:47.893009 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.893086 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.893414 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.393173 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.393243 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.393496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.893310 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.893387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.893701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:48.893755 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:49.392365 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.392767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:49.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.892412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.392433 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:51.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.392457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:51.392745 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:51.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.392597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.392916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.892836 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.892908 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.893174 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:53.393008 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.393079 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.393392 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:53.393445 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:53.893170 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.893250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.893577 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.393256 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.393323 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.393624 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.892833 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.892909 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.893230 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.392807 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.392879 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.393195 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.892903 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.892983 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.893248 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:55.893295 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:56.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.393164 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.393494 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:56.893233 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.893303 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.893625 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.392308 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.892561 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.892640 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.893004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:58.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.392770 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:58.392830 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:58.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.392417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.392375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.392732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:00.892926 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:01.392600 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.392680 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.393027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:01.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.392459 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.392534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.392891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.892379 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:03.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.392673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:03.392717 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:03.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.892520 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.892803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:05.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:05.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:05.892501 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.892578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.892871 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.392458 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.392846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.892564 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.892651 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:07.392849 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.392927 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.393249 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:07.393300 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:07.893061 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.893141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.893407 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.393571 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.893213 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.893285 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.893586 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:09.393338 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.393409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.393737 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:09.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:09.892310 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.892384 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.892460 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.892531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.392821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.892452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:11.892845 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:12.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.392447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:12.892659 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.893041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.392855 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.892343 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:14.392354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.392431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:14.392815 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:14.892477 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.892893 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:16.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.392508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:16.392886 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:16.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.392663 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.393122 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.893034 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.893112 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.893434 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:18.393225 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.393298 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.393566 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:18.393619 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:18.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.892776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.392463 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.392545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:20.892875 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:21.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.392697 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:21.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.392426 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.892691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:23.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:23.392897 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:23.892549 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.892621 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.892930 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.392371 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:25.392487 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.392571 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.392908 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:25.392969 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:25.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.892701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.392465 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.392539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.892616 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.892694 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.892997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:27.893042 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:28.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:28.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.392438 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.892380 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:30.392322 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:30.392775 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:30.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.392484 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.392563 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.392894 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:32.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:32.392859 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:32.892645 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.892720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.893273 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.393053 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.393128 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.393383 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.893197 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.893271 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.893602 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.392329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.892409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:34.892764 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:35.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:35.892454 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.392716 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:36.892870 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:37.392455 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.392936 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:37.893011 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.893129 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.893427 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.393291 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.393628 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.893349 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.893718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:38.893795 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:39.393265 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.393337 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.393588 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:39.893331 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.893408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.893734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.892710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:41.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:41.392842 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:41.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.892453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.892740 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.893071 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:43.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.392714 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.393030 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:43.393087 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:43.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.892423 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.892741 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:45.892958 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:46.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:46.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.892695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.392510 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.392586 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.392951 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:48.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.392712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:48.392768 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:48.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.392787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.892646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:50.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:50.392909 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:50.892579 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.892652 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.892985 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.392709 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.392792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.892354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:52.892725 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:53.392414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.392831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:53.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:54.892908 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:55.392584 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.392665 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.392996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:55.892326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.892800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.392401 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.392474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.892944 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:56.892999 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:57.392855 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.392925 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:57.893155 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.893229 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.893537 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.393357 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.393431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.393713 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:59.392495 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.392587 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.392917 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:59.392973 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:59.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.893052 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:01.392764 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.392860 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:01.393227 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:01.892948 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.893270 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.393163 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.393444 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.892430 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.892742 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.892755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:03.892801 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:04.392448 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.392523 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:04.892336 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.892677 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:05.892848 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:06.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:06.892407 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.892929 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.392690 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.392765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.393085 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.893075 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.893147 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:07.893442 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:08.393253 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.393325 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.393644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:08.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.392738 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:10.392423 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:10.392894 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:10.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.392445 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.392851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.892553 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.892632 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.892973 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.392666 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:12.892876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:13.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.392815 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:13.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.392377 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:14.892953 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:15.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.392701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:15.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.392442 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.392833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.892667 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:17.392522 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.392590 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:17.392921 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:17.892652 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.892728 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.893037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.892817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.392472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.892338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:19.892782 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:20.392466 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:20.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:21.892849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:22.392532 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.392615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.392957 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:22.892669 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.892988 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.392497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.892580 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.892912 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:23.892972 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:24.392362 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.392433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.392780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:24.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.892471 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.392469 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.392543 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:26.392450 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.392522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.392876 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:26.392935 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:26.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.892686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.392919 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.392992 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.393253 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.893175 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.893249 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.893584 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.392338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.892451 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:28.892862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:29.392512 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.392870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:29.892578 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.892656 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.392314 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.893263 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.893356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.893732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:30.893797 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:31.392476 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.392560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:31.892342 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.892698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.392782 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:33.392366 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.392818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:33.392876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:33.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.892447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.392507 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.392934 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.892413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:35.392488 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:35.392932 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:35.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.892504 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.392359 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.392660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.393104 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:37.393155 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:37.892886 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.892951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.893194 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.392988 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.393067 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.393390 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.893201 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.893273 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.893589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:39.393344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.393415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.393662 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:39.393702 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:39.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.392451 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.392529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.392862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.892870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:42.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:42.892516 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.892611 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.892938 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.392638 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.392711 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.393037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.892699 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.892765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:43.893055 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:44.392565 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.392645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.392956 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:44.892656 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.892731 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.893058 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.392581 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.392846 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.393290 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.893123 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.893447 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:45.893502 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:46.393297 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.393390 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.393780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:46.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.892799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.392560 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.392631 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.392960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.892485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:48.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.392663 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:48.392703 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:48.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.392544 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.392943 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:50.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:50.392849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:50.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.392654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.892315 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.892388 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.892718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:52.392427 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:52.392889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:52.892324 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.892546 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.892874 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.392387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.892485 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.892882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:54.892940 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:55.392636 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.393066 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:55.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:57.392739 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.392818 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.393074 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:57.393125 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:57.892970 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.893044 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.893385 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.393560 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.893294 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.893360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.893621 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.892745 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:59.892790 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:00.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.392421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:00.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.892467 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.392584 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:02.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.392819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:02.392871 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:02.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.392644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.392425 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.892634 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:04.892673 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:05.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:05.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.892809 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.392658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:06.892843 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:07.392572 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.392646 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:07.892874 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.892944 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.893199 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.393002 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.393080 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.393411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.893209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.893286 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.893674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:08.893724 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:09.392325 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.392655 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:09.892370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.392431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:11.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:11.392883 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:11.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.892939 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.392370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.892624 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:13.392523 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.392903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:13.392963 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:13.892519 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.892431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.892510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.392321 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.892434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.892732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:15.892778 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:16.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:16.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.893341 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.893646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.392499 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.392579 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.892490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:17.892807 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:18.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:18.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.393074 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.393141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.393442 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.893090 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.893170 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.893422 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:19.893473 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:20.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.393284 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.393611 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:20.893284 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.893361 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:22.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:22.392844 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:22.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.892425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.892708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.392485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.392394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.392659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.892838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:24.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:25.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:25.892357 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.892457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.892775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.392472 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.892582 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.892678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.893022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:26.893073 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:27.392724 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.392791 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.393032 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:27.893016 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.893101 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.893433 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.393326 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.892772 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:29.392456 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.392869 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:29.392915 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:29.892602 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.892673 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.892440 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.392566 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.892417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:31.892716 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:32.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.392776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:32.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.892858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:33.392332 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.397972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1218 00:43:33.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:33.892918 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:34.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:34.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.892442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.892725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.392438 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.392511 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.892534 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.892662 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:35.893039 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:36.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:36.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.392739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.892912 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.892985 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.893251 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:37.893290 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:38.393068 1195787 node_ready.go:38] duration metric: took 6m0.000870722s for node "functional-288604" to be "Ready" ...
	I1218 00:43:38.396243 1195787 out.go:203] 
	W1218 00:43:38.399208 1195787 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:43:38.399223 1195787 out.go:285] * 
	W1218 00:43:38.401353 1195787 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:43:38.404386 1195787 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:43:47 functional-288604 crio[5385]: time="2025-12-18T00:43:47.483682926Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=459e7e34-46b9-4af3-aeb2-3828d78787c0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.544721096Z" level=info msg="Checking image status: minikube-local-cache-test:functional-288604" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.544915906Z" level=info msg="Resolving \"minikube-local-cache-test\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.54496525Z" level=info msg="Image minikube-local-cache-test:functional-288604 not found" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.545066023Z" level=info msg="Neither image nor artfiact minikube-local-cache-test:functional-288604 found" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570709928Z" level=info msg="Checking image status: docker.io/library/minikube-local-cache-test:functional-288604" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570874978Z" level=info msg="Image docker.io/library/minikube-local-cache-test:functional-288604 not found" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570929262Z" level=info msg="Neither image nor artfiact docker.io/library/minikube-local-cache-test:functional-288604 found" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598192827Z" level=info msg="Checking image status: localhost/library/minikube-local-cache-test:functional-288604" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598368732Z" level=info msg="Image localhost/library/minikube-local-cache-test:functional-288604 not found" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598409798Z" level=info msg="Neither image nor artfiact localhost/library/minikube-local-cache-test:functional-288604 found" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.536834458Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=e9495e62-56fb-476b-9a5c-2d2a6e884dca name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852180343Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852349881Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852397609Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417235866Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417374333Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417420551Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.44140991Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.441570857Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.441617437Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.464727663Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.464890079Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.46494191Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:51 functional-288604 crio[5385]: time="2025-12-18T00:43:51.017019016Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=e08e7f89-a2cd-41e3-9d16-4f3e39c6f546 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:43:52.512255    9410 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:52.512763    9410 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:52.514372    9410 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:52.514995    9410 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:52.516602    9410 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:43:52 up  7:26,  0 user,  load average: 0.44, 0.25, 0.59
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:43:49 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:50 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1154.
	Dec 18 00:43:50 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:50 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:50 functional-288604 kubelet[9280]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:50 functional-288604 kubelet[9280]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:50 functional-288604 kubelet[9280]: E1218 00:43:50.697596    9280 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:50 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:50 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:51 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1155.
	Dec 18 00:43:51 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:51 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:51 functional-288604 kubelet[9305]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:51 functional-288604 kubelet[9305]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:51 functional-288604 kubelet[9305]: E1218 00:43:51.449120    9305 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:51 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:51 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1156.
	Dec 18 00:43:52 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:52 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:52 functional-288604 kubelet[9327]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:52 functional-288604 kubelet[9327]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:52 functional-288604 kubelet[9327]: E1218 00:43:52.193904    9327 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (322.513427ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.35s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.42s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-288604 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-288604 get pods: exit status 1 (107.135804ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-288604 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (329.915715ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 logs -n 25: (1.041699102s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                      ARGS                                                                       │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-240845 image ls --format json --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ ssh     │ functional-240845 ssh pgrep buildkitd                                                                                                           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image   │ functional-240845 image ls --format yaml --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                          │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format table --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format short --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete  │ -p functional-240845                                                                                                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start   │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start   │ -p functional-288604 --alsologtostderr -v=8                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.1                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.3                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:latest                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add minikube-local-cache-test:functional-288604                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache delete minikube-local-cache-test:functional-288604                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ list                                                                                                                                            │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl images                                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ cache   │ functional-288604 cache reload                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ kubectl │ functional-288604 kubectl -- --context functional-288604 get pods                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:37:32
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:37:32.486183 1195787 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:37:32.486610 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486624 1195787 out.go:374] Setting ErrFile to fd 2...
	I1218 00:37:32.486629 1195787 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:37:32.486918 1195787 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:37:32.487313 1195787 out.go:368] Setting JSON to false
	I1218 00:37:32.488152 1195787 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26401,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:37:32.488255 1195787 start.go:143] virtualization:  
	I1218 00:37:32.491971 1195787 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:37:32.494842 1195787 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:37:32.494944 1195787 notify.go:221] Checking for updates...
	I1218 00:37:32.500434 1195787 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:37:32.503311 1195787 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:32.506071 1195787 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:37:32.508979 1195787 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:37:32.511873 1195787 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:37:32.515326 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:32.515476 1195787 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:37:32.549560 1195787 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:37:32.549709 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.608968 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.600331572 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.609068 1195787 docker.go:319] overlay module found
	I1218 00:37:32.612053 1195787 out.go:179] * Using the docker driver based on existing profile
	I1218 00:37:32.614859 1195787 start.go:309] selected driver: docker
	I1218 00:37:32.614879 1195787 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.614985 1195787 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:37:32.615081 1195787 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:37:32.681718 1195787 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:37:32.67244891 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:37:32.682130 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:32.682189 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:32.682255 1195787 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:32.687138 1195787 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:37:32.690134 1195787 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:37:32.693078 1195787 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:37:32.696069 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:32.696123 1195787 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:37:32.696143 1195787 cache.go:65] Caching tarball of preloaded images
	I1218 00:37:32.696183 1195787 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:37:32.696303 1195787 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:37:32.696317 1195787 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:37:32.696417 1195787 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:37:32.714975 1195787 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:37:32.714995 1195787 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:37:32.715013 1195787 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:37:32.715043 1195787 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:37:32.715099 1195787 start.go:364] duration metric: took 33.796µs to acquireMachinesLock for "functional-288604"
	I1218 00:37:32.715121 1195787 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:37:32.715131 1195787 fix.go:54] fixHost starting: 
	I1218 00:37:32.715395 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:32.731575 1195787 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:37:32.731606 1195787 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:37:32.734910 1195787 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:37:32.734955 1195787 machine.go:94] provisionDockerMachine start ...
	I1218 00:37:32.735034 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.751418 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.751747 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.751760 1195787 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:37:32.904326 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:32.904350 1195787 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:37:32.904413 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:32.933199 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:32.933525 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:32.933536 1195787 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:37:33.096692 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:37:33.096816 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.115124 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.115445 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.115466 1195787 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:37:33.272592 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:37:33.272617 1195787 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:37:33.272637 1195787 ubuntu.go:190] setting up certificates
	I1218 00:37:33.272647 1195787 provision.go:84] configureAuth start
	I1218 00:37:33.272712 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:33.291737 1195787 provision.go:143] copyHostCerts
	I1218 00:37:33.291803 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291863 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:37:33.291880 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:37:33.291977 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:37:33.292105 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292127 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:37:33.292137 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:37:33.292177 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:37:33.292274 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292300 1195787 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:37:33.292315 1195787 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:37:33.292347 1195787 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:37:33.292433 1195787 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:37:33.397529 1195787 provision.go:177] copyRemoteCerts
	I1218 00:37:33.397646 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:37:33.397692 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.416603 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:33.523879 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 00:37:33.523950 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:37:33.540143 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 00:37:33.540204 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:37:33.557091 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 00:37:33.557194 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:37:33.573937 1195787 provision.go:87] duration metric: took 301.27685ms to configureAuth
	I1218 00:37:33.573963 1195787 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:37:33.574138 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:33.574247 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.591351 1195787 main.go:143] libmachine: Using SSH client type: native
	I1218 00:37:33.591663 1195787 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:37:33.591676 1195787 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:37:33.932454 1195787 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:37:33.932478 1195787 machine.go:97] duration metric: took 1.197515142s to provisionDockerMachine
	I1218 00:37:33.932490 1195787 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:37:33.932503 1195787 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:37:33.932581 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:37:33.932636 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:33.953296 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.060199 1195787 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:37:34.063627 1195787 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1218 00:37:34.063655 1195787 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1218 00:37:34.063660 1195787 command_runner.go:130] > VERSION_ID="12"
	I1218 00:37:34.063664 1195787 command_runner.go:130] > VERSION="12 (bookworm)"
	I1218 00:37:34.063680 1195787 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1218 00:37:34.063684 1195787 command_runner.go:130] > ID=debian
	I1218 00:37:34.063689 1195787 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1218 00:37:34.063694 1195787 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1218 00:37:34.063700 1195787 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1218 00:37:34.063783 1195787 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:37:34.063800 1195787 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:37:34.063810 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:37:34.063871 1195787 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:37:34.063955 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:37:34.063966 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 00:37:34.064048 1195787 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:37:34.064056 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> /etc/test/nested/copy/1159552/hosts
	I1218 00:37:34.064100 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:37:34.071756 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:34.089207 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:37:34.106978 1195787 start.go:296] duration metric: took 174.472072ms for postStartSetup
	I1218 00:37:34.107054 1195787 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:37:34.107096 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.124265 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.224786 1195787 command_runner.go:130] > 12%
	I1218 00:37:34.224858 1195787 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:37:34.228879 1195787 command_runner.go:130] > 171G
	I1218 00:37:34.229324 1195787 fix.go:56] duration metric: took 1.514188493s for fixHost
	I1218 00:37:34.229353 1195787 start.go:83] releasing machines lock for "functional-288604", held for 1.514233177s
	I1218 00:37:34.229425 1195787 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:37:34.246154 1195787 ssh_runner.go:195] Run: cat /version.json
	I1218 00:37:34.246206 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.246451 1195787 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:37:34.246509 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:34.266363 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.276260 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:34.371623 1195787 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1218 00:37:34.371754 1195787 ssh_runner.go:195] Run: systemctl --version
	I1218 00:37:34.461010 1195787 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1218 00:37:34.461057 1195787 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1218 00:37:34.461077 1195787 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1218 00:37:34.461152 1195787 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:37:34.497659 1195787 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1218 00:37:34.501645 1195787 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1218 00:37:34.502005 1195787 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:37:34.502070 1195787 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:37:34.509755 1195787 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:37:34.509780 1195787 start.go:496] detecting cgroup driver to use...
	I1218 00:37:34.509811 1195787 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:37:34.509875 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:37:34.523916 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:37:34.536646 1195787 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:37:34.536736 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:37:34.551504 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:37:34.564054 1195787 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:37:34.675890 1195787 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:37:34.798642 1195787 docker.go:234] disabling docker service ...
	I1218 00:37:34.798703 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:37:34.813006 1195787 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:37:34.825087 1195787 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:37:34.942798 1195787 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:37:35.067868 1195787 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:37:35.088600 1195787 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:37:35.102366 1195787 command_runner.go:130] > runtime-endpoint: unix:///var/run/crio/crio.sock
	I1218 00:37:35.103752 1195787 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:37:35.103819 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.113147 1195787 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:37:35.113241 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.122530 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.131393 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.140799 1195787 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:37:35.148737 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.157396 1195787 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.165643 1195787 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.174650 1195787 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:37:35.181215 1195787 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1218 00:37:35.182122 1195787 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:37:35.189136 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.306446 1195787 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:37:35.483449 1195787 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:37:35.483550 1195787 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:37:35.487145 1195787 command_runner.go:130] >   File: /var/run/crio/crio.sock
	I1218 00:37:35.487172 1195787 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1218 00:37:35.487179 1195787 command_runner.go:130] > Device: 0,72	Inode: 1642        Links: 1
	I1218 00:37:35.487186 1195787 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:35.487202 1195787 command_runner.go:130] > Access: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487220 1195787 command_runner.go:130] > Modify: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487225 1195787 command_runner.go:130] > Change: 2025-12-18 00:37:35.404376213 +0000
	I1218 00:37:35.487229 1195787 command_runner.go:130] >  Birth: -
	I1218 00:37:35.487254 1195787 start.go:564] Will wait 60s for crictl version
	I1218 00:37:35.487306 1195787 ssh_runner.go:195] Run: which crictl
	I1218 00:37:35.490344 1195787 command_runner.go:130] > /usr/local/bin/crictl
	I1218 00:37:35.490683 1195787 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:37:35.512944 1195787 command_runner.go:130] > Version:  0.1.0
	I1218 00:37:35.513232 1195787 command_runner.go:130] > RuntimeName:  cri-o
	I1218 00:37:35.513363 1195787 command_runner.go:130] > RuntimeVersion:  1.34.3
	I1218 00:37:35.513391 1195787 command_runner.go:130] > RuntimeApiVersion:  v1
	I1218 00:37:35.515559 1195787 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:37:35.515677 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.541522 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.541589 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.541609 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.541630 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.541651 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.541672 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.541692 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.541720 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.541741 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.541768 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.541786 1195787 command_runner.go:130] >      static
	I1218 00:37:35.541805 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.541829 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.541856 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.541889 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.541915 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.541933 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.541952 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.541983 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.541999 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.543191 1195787 ssh_runner.go:195] Run: crio --version
	I1218 00:37:35.569029 1195787 command_runner.go:130] > crio version 1.34.3
	I1218 00:37:35.569102 1195787 command_runner.go:130] >    GitCommit:      067a88aedf5d7c658a2acb81afe82d6c3a367a52
	I1218 00:37:35.569122 1195787 command_runner.go:130] >    GitCommitDate:  2025-12-01T16:44:09Z
	I1218 00:37:35.569144 1195787 command_runner.go:130] >    GitTreeState:   dirty
	I1218 00:37:35.569164 1195787 command_runner.go:130] >    BuildDate:      1970-01-01T00:00:00Z
	I1218 00:37:35.569191 1195787 command_runner.go:130] >    GoVersion:      go1.24.6
	I1218 00:37:35.569210 1195787 command_runner.go:130] >    Compiler:       gc
	I1218 00:37:35.569239 1195787 command_runner.go:130] >    Platform:       linux/arm64
	I1218 00:37:35.569267 1195787 command_runner.go:130] >    Linkmode:       static
	I1218 00:37:35.569285 1195787 command_runner.go:130] >    BuildTags:
	I1218 00:37:35.569302 1195787 command_runner.go:130] >      static
	I1218 00:37:35.569320 1195787 command_runner.go:130] >      netgo
	I1218 00:37:35.569347 1195787 command_runner.go:130] >      osusergo
	I1218 00:37:35.569366 1195787 command_runner.go:130] >      exclude_graphdriver_btrfs
	I1218 00:37:35.569384 1195787 command_runner.go:130] >      seccomp
	I1218 00:37:35.569405 1195787 command_runner.go:130] >      apparmor
	I1218 00:37:35.569429 1195787 command_runner.go:130] >      selinux
	I1218 00:37:35.569449 1195787 command_runner.go:130] >    LDFlags:          unknown
	I1218 00:37:35.569467 1195787 command_runner.go:130] >    SeccompEnabled:   true
	I1218 00:37:35.569485 1195787 command_runner.go:130] >    AppArmorEnabled:  false
	I1218 00:37:35.575974 1195787 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:37:35.578737 1195787 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:37:35.594362 1195787 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:37:35.598161 1195787 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1218 00:37:35.598363 1195787 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:37:35.598485 1195787 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:37:35.598543 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.635547 1195787 command_runner.go:130] > {
	I1218 00:37:35.635578 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.635584 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635591 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.635596 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635602 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.635605 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635609 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635623 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.635631 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.635634 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635639 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.635643 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635650 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635654 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635657 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635668 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.635672 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635677 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.635680 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635684 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635693 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.635701 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.635704 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635709 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.635712 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635719 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635723 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635731 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.635735 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635740 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.635743 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635747 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635758 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.635773 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.635777 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635781 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.635786 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.635790 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635793 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635795 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635802 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.635805 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635810 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.635815 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635823 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635830 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.635838 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.635841 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635845 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.635848 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635852 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635855 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635864 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635868 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635872 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635875 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635881 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.635885 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635890 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.635893 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635897 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635905 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.635912 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.635915 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635920 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.635926 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.635930 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.635934 1195787 command_runner.go:130] >       },
	I1218 00:37:35.635938 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.635941 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.635944 1195787 command_runner.go:130] >     },
	I1218 00:37:35.635947 1195787 command_runner.go:130] >     {
	I1218 00:37:35.635954 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.635957 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.635963 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.635966 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635970 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.635978 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.635986 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.635989 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.635993 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.635997 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636000 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636003 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636007 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636011 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636013 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636016 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636022 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.636027 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636032 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.636035 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636039 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636046 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.636054 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.636057 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636060 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.636064 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636073 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636077 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636079 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636086 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.636090 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636095 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.636098 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636102 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636110 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.636125 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.636133 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636137 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.636140 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636144 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.636147 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636151 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636154 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.636158 1195787 command_runner.go:130] >     },
	I1218 00:37:35.636160 1195787 command_runner.go:130] >     {
	I1218 00:37:35.636166 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.636170 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.636175 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.636178 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636182 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.636190 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.636197 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.636200 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.636204 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.636208 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.636211 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.636214 1195787 command_runner.go:130] >       },
	I1218 00:37:35.636238 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.636243 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.636251 1195787 command_runner.go:130] >     }
	I1218 00:37:35.636254 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.636256 1195787 command_runner.go:130] > }
	I1218 00:37:35.636431 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.636439 1195787 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:37:35.636495 1195787 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:37:35.658094 1195787 command_runner.go:130] > {
	I1218 00:37:35.658111 1195787 command_runner.go:130] >   "images":  [
	I1218 00:37:35.658115 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658124 1195787 command_runner.go:130] >       "id":  "b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1218 00:37:35.658128 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658134 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1218 00:37:35.658137 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658141 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658151 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a",
	I1218 00:37:35.658159 1195787 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"
	I1218 00:37:35.658163 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658167 1195787 command_runner.go:130] >       "size":  "111333938",
	I1218 00:37:35.658171 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658176 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658179 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658182 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658189 1195787 command_runner.go:130] >       "id":  "ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1218 00:37:35.658192 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658198 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1218 00:37:35.658201 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658205 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658213 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2",
	I1218 00:37:35.658222 1195787 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1218 00:37:35.658225 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658229 1195787 command_runner.go:130] >       "size":  "29037500",
	I1218 00:37:35.658233 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658242 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658250 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658262 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658269 1195787 command_runner.go:130] >       "id":  "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1218 00:37:35.658273 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658279 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1218 00:37:35.658282 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658286 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658294 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6",
	I1218 00:37:35.658302 1195787 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"
	I1218 00:37:35.658305 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658309 1195787 command_runner.go:130] >       "size":  "74491780",
	I1218 00:37:35.658313 1195787 command_runner.go:130] >       "username":  "nonroot",
	I1218 00:37:35.658317 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658321 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658323 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658330 1195787 command_runner.go:130] >       "id":  "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1218 00:37:35.658334 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658339 1195787 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1218 00:37:35.658344 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658348 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658356 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890",
	I1218 00:37:35.658367 1195787 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"
	I1218 00:37:35.658370 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658374 1195787 command_runner.go:130] >       "size":  "60850387",
	I1218 00:37:35.658378 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658382 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658384 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658393 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658397 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658400 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658403 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658410 1195787 command_runner.go:130] >       "id":  "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1218 00:37:35.658413 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658425 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1218 00:37:35.658431 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658435 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658443 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee",
	I1218 00:37:35.658455 1195787 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"
	I1218 00:37:35.658465 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658469 1195787 command_runner.go:130] >       "size":  "85015535",
	I1218 00:37:35.658472 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658476 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658479 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658483 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658487 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658490 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658493 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658499 1195787 command_runner.go:130] >       "id":  "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1218 00:37:35.658503 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658508 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1218 00:37:35.658511 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658515 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658523 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f",
	I1218 00:37:35.658532 1195787 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1218 00:37:35.658535 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658539 1195787 command_runner.go:130] >       "size":  "72170325",
	I1218 00:37:35.658543 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658549 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658552 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658556 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658560 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658563 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658566 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658572 1195787 command_runner.go:130] >       "id":  "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1218 00:37:35.658577 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658582 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1218 00:37:35.658589 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658598 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658605 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f",
	I1218 00:37:35.658613 1195787 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1218 00:37:35.658616 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658620 1195787 command_runner.go:130] >       "size":  "74107287",
	I1218 00:37:35.658624 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658628 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658631 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658642 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658650 1195787 command_runner.go:130] >       "id":  "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1218 00:37:35.658653 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658659 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1218 00:37:35.658662 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658666 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658677 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3",
	I1218 00:37:35.658694 1195787 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"
	I1218 00:37:35.658697 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658701 1195787 command_runner.go:130] >       "size":  "49822549",
	I1218 00:37:35.658705 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658708 1195787 command_runner.go:130] >         "value":  "0"
	I1218 00:37:35.658711 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658715 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658718 1195787 command_runner.go:130] >       "pinned":  false
	I1218 00:37:35.658721 1195787 command_runner.go:130] >     },
	I1218 00:37:35.658725 1195787 command_runner.go:130] >     {
	I1218 00:37:35.658731 1195787 command_runner.go:130] >       "id":  "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1218 00:37:35.658734 1195787 command_runner.go:130] >       "repoTags":  [
	I1218 00:37:35.658739 1195787 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.658742 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658746 1195787 command_runner.go:130] >       "repoDigests":  [
	I1218 00:37:35.658754 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c",
	I1218 00:37:35.658761 1195787 command_runner.go:130] >         "registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"
	I1218 00:37:35.658772 1195787 command_runner.go:130] >       ],
	I1218 00:37:35.658777 1195787 command_runner.go:130] >       "size":  "519884",
	I1218 00:37:35.658781 1195787 command_runner.go:130] >       "uid":  {
	I1218 00:37:35.658784 1195787 command_runner.go:130] >         "value":  "65535"
	I1218 00:37:35.658788 1195787 command_runner.go:130] >       },
	I1218 00:37:35.658791 1195787 command_runner.go:130] >       "username":  "",
	I1218 00:37:35.658794 1195787 command_runner.go:130] >       "pinned":  true
	I1218 00:37:35.658798 1195787 command_runner.go:130] >     }
	I1218 00:37:35.658800 1195787 command_runner.go:130] >   ]
	I1218 00:37:35.658803 1195787 command_runner.go:130] > }
	I1218 00:37:35.660205 1195787 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:37:35.660262 1195787 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:37:35.660279 1195787 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:37:35.660385 1195787 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:37:35.660470 1195787 ssh_runner.go:195] Run: crio config
	I1218 00:37:35.707278 1195787 command_runner.go:130] > # The CRI-O configuration file specifies all of the available configuration
	I1218 00:37:35.707300 1195787 command_runner.go:130] > # options and command-line flags for the crio(8) OCI Kubernetes Container Runtime
	I1218 00:37:35.707307 1195787 command_runner.go:130] > # daemon, but in a TOML format that can be more easily modified and versioned.
	I1218 00:37:35.707310 1195787 command_runner.go:130] > #
	I1218 00:37:35.707318 1195787 command_runner.go:130] > # Please refer to crio.conf(5) for details of all configuration options.
	I1218 00:37:35.707324 1195787 command_runner.go:130] > # CRI-O supports partial configuration reload during runtime, which can be
	I1218 00:37:35.707330 1195787 command_runner.go:130] > # done by sending SIGHUP to the running process. Currently supported options
	I1218 00:37:35.707346 1195787 command_runner.go:130] > # are explicitly mentioned with: 'This option supports live configuration
	I1218 00:37:35.707349 1195787 command_runner.go:130] > # reload'.
	I1218 00:37:35.707356 1195787 command_runner.go:130] > # CRI-O reads its storage defaults from the containers-storage.conf(5) file
	I1218 00:37:35.707362 1195787 command_runner.go:130] > # located at /etc/containers/storage.conf. Modify this storage configuration if
	I1218 00:37:35.707368 1195787 command_runner.go:130] > # you want to change the system's defaults. If you want to modify storage just
	I1218 00:37:35.707383 1195787 command_runner.go:130] > # for CRI-O, you can change the storage configuration options here.
	I1218 00:37:35.707387 1195787 command_runner.go:130] > [crio]
	I1218 00:37:35.707393 1195787 command_runner.go:130] > # Path to the "root directory". CRI-O stores all of its data, including
	I1218 00:37:35.707398 1195787 command_runner.go:130] > # containers images, in this directory.
	I1218 00:37:35.707595 1195787 command_runner.go:130] > # root = "/home/docker/.local/share/containers/storage"
	I1218 00:37:35.707607 1195787 command_runner.go:130] > # Path to the "run directory". CRI-O stores all of its state in this directory.
	I1218 00:37:35.707620 1195787 command_runner.go:130] > # runroot = "/tmp/storage-run-1000/containers"
	I1218 00:37:35.707627 1195787 command_runner.go:130] > # Path to the "imagestore". If CRI-O stores all of its images in this directory differently than Root.
	I1218 00:37:35.707631 1195787 command_runner.go:130] > # imagestore = ""
	I1218 00:37:35.707637 1195787 command_runner.go:130] > # Storage driver used to manage the storage of images and containers. Please
	I1218 00:37:35.707643 1195787 command_runner.go:130] > # refer to containers-storage.conf(5) to see all available storage drivers.
	I1218 00:37:35.707768 1195787 command_runner.go:130] > # storage_driver = "overlay"
	I1218 00:37:35.707777 1195787 command_runner.go:130] > # List to pass options to the storage driver. Please refer to
	I1218 00:37:35.707784 1195787 command_runner.go:130] > # containers-storage.conf(5) to see all available storage options.
	I1218 00:37:35.707788 1195787 command_runner.go:130] > # storage_option = [
	I1218 00:37:35.707935 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.707952 1195787 command_runner.go:130] > # The default log directory where all logs will go unless directly specified by
	I1218 00:37:35.707959 1195787 command_runner.go:130] > # the kubelet. The log directory specified must be an absolute directory.
	I1218 00:37:35.707971 1195787 command_runner.go:130] > # log_dir = "/var/log/crio/pods"
	I1218 00:37:35.707978 1195787 command_runner.go:130] > # Location for CRI-O to lay down the temporary version file.
	I1218 00:37:35.707984 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe containers, which should
	I1218 00:37:35.707990 1195787 command_runner.go:130] > # always happen on a node reboot
	I1218 00:37:35.708138 1195787 command_runner.go:130] > # version_file = "/var/run/crio/version"
	I1218 00:37:35.708160 1195787 command_runner.go:130] > # Location for CRI-O to lay down the persistent version file.
	I1218 00:37:35.708174 1195787 command_runner.go:130] > # It is used to check if crio wipe should wipe images, which should
	I1218 00:37:35.708183 1195787 command_runner.go:130] > # only happen when CRI-O has been upgraded
	I1218 00:37:35.708342 1195787 command_runner.go:130] > # version_file_persist = ""
	I1218 00:37:35.708354 1195787 command_runner.go:130] > # InternalWipe is whether CRI-O should wipe containers and images after a reboot when the server starts.
	I1218 00:37:35.708363 1195787 command_runner.go:130] > # If set to false, one must use the external command 'crio wipe' to wipe the containers and images in these situations.
	I1218 00:37:35.708367 1195787 command_runner.go:130] > # internal_wipe = true
	I1218 00:37:35.708381 1195787 command_runner.go:130] > # InternalRepair is whether CRI-O should check if the container and image storage was corrupted after a sudden restart.
	I1218 00:37:35.708388 1195787 command_runner.go:130] > # If it was, CRI-O also attempts to repair the storage.
	I1218 00:37:35.708503 1195787 command_runner.go:130] > # internal_repair = true
	I1218 00:37:35.708512 1195787 command_runner.go:130] > # Location for CRI-O to lay down the clean shutdown file.
	I1218 00:37:35.708519 1195787 command_runner.go:130] > # It is used to check whether crio had time to sync before shutting down.
	I1218 00:37:35.708525 1195787 command_runner.go:130] > # If not found, crio wipe will clear the storage directory.
	I1218 00:37:35.708671 1195787 command_runner.go:130] > # clean_shutdown_file = "/var/lib/crio/clean.shutdown"
	I1218 00:37:35.708682 1195787 command_runner.go:130] > # The crio.api table contains settings for the kubelet/gRPC interface.
	I1218 00:37:35.708686 1195787 command_runner.go:130] > [crio.api]
	I1218 00:37:35.708706 1195787 command_runner.go:130] > # Path to AF_LOCAL socket on which CRI-O will listen.
	I1218 00:37:35.708833 1195787 command_runner.go:130] > # listen = "/var/run/crio/crio.sock"
	I1218 00:37:35.708843 1195787 command_runner.go:130] > # IP address on which the stream server will listen.
	I1218 00:37:35.708997 1195787 command_runner.go:130] > # stream_address = "127.0.0.1"
	I1218 00:37:35.709007 1195787 command_runner.go:130] > # The port on which the stream server will listen. If the port is set to "0", then
	I1218 00:37:35.709013 1195787 command_runner.go:130] > # CRI-O will allocate a random free port number.
	I1218 00:37:35.709016 1195787 command_runner.go:130] > # stream_port = "0"
	I1218 00:37:35.709022 1195787 command_runner.go:130] > # Enable encrypted TLS transport of the stream server.
	I1218 00:37:35.709150 1195787 command_runner.go:130] > # stream_enable_tls = false
	I1218 00:37:35.709160 1195787 command_runner.go:130] > # Length of time until open streams terminate due to lack of activity
	I1218 00:37:35.709282 1195787 command_runner.go:130] > # stream_idle_timeout = ""
	I1218 00:37:35.709292 1195787 command_runner.go:130] > # Path to the x509 certificate file used to serve the encrypted stream. This
	I1218 00:37:35.709298 1195787 command_runner.go:130] > # file can change, and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709420 1195787 command_runner.go:130] > # stream_tls_cert = ""
	I1218 00:37:35.709430 1195787 command_runner.go:130] > # Path to the key file used to serve the encrypted stream. This file can
	I1218 00:37:35.709436 1195787 command_runner.go:130] > # change and CRI-O will automatically pick up the changes.
	I1218 00:37:35.709440 1195787 command_runner.go:130] > # stream_tls_key = ""
	I1218 00:37:35.709447 1195787 command_runner.go:130] > # Path to the x509 CA(s) file used to verify and authenticate client
	I1218 00:37:35.709453 1195787 command_runner.go:130] > # communication with the encrypted stream. This file can change and CRI-O will
	I1218 00:37:35.709462 1195787 command_runner.go:130] > # automatically pick up the changes.
	I1218 00:37:35.709593 1195787 command_runner.go:130] > # stream_tls_ca = ""
	I1218 00:37:35.709614 1195787 command_runner.go:130] > # Maximum grpc send message size in bytes. If not set or <=0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709735 1195787 command_runner.go:130] > # grpc_max_send_msg_size = 83886080
	I1218 00:37:35.709746 1195787 command_runner.go:130] > # Maximum grpc receive message size. If not set or <= 0, then CRI-O will default to 80 * 1024 * 1024.
	I1218 00:37:35.709864 1195787 command_runner.go:130] > # grpc_max_recv_msg_size = 83886080
	I1218 00:37:35.709875 1195787 command_runner.go:130] > # The crio.runtime table contains settings pertaining to the OCI runtime used
	I1218 00:37:35.709881 1195787 command_runner.go:130] > # and options for how to set up and manage the OCI runtime.
	I1218 00:37:35.709885 1195787 command_runner.go:130] > [crio.runtime]
	I1218 00:37:35.709891 1195787 command_runner.go:130] > # A list of ulimits to be set in containers by default, specified as
	I1218 00:37:35.709896 1195787 command_runner.go:130] > # "<ulimit name>=<soft limit>:<hard limit>", for example:
	I1218 00:37:35.709907 1195787 command_runner.go:130] > # "nofile=1024:2048"
	I1218 00:37:35.709913 1195787 command_runner.go:130] > # If nothing is set here, settings will be inherited from the CRI-O daemon
	I1218 00:37:35.709917 1195787 command_runner.go:130] > # default_ulimits = [
	I1218 00:37:35.710017 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710026 1195787 command_runner.go:130] > # If true, the runtime will not use pivot_root, but instead use MS_MOVE.
	I1218 00:37:35.710154 1195787 command_runner.go:130] > # no_pivot = false
	I1218 00:37:35.710163 1195787 command_runner.go:130] > # decryption_keys_path is the path where the keys required for
	I1218 00:37:35.710170 1195787 command_runner.go:130] > # image decryption are stored. This option supports live configuration reload.
	I1218 00:37:35.710300 1195787 command_runner.go:130] > # decryption_keys_path = "/etc/crio/keys/"
	I1218 00:37:35.710309 1195787 command_runner.go:130] > # Path to the conmon binary, used for monitoring the OCI runtime.
	I1218 00:37:35.710323 1195787 command_runner.go:130] > # Will be searched for using $PATH if empty.
	I1218 00:37:35.710336 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710476 1195787 command_runner.go:130] > # conmon = ""
	I1218 00:37:35.710485 1195787 command_runner.go:130] > # Cgroup setting for conmon
	I1218 00:37:35.710492 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorCgroup.
	I1218 00:37:35.710496 1195787 command_runner.go:130] > conmon_cgroup = "pod"
	I1218 00:37:35.710508 1195787 command_runner.go:130] > # Environment variable list for the conmon process, used for passing necessary
	I1218 00:37:35.710514 1195787 command_runner.go:130] > # environment variables to conmon or the runtime.
	I1218 00:37:35.710521 1195787 command_runner.go:130] > # This option is currently deprecated, and will be replaced with RuntimeHandler.MonitorEnv.
	I1218 00:37:35.710524 1195787 command_runner.go:130] > # conmon_env = [
	I1218 00:37:35.710624 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710633 1195787 command_runner.go:130] > # Additional environment variables to set for all the
	I1218 00:37:35.710639 1195787 command_runner.go:130] > # containers. These are overridden if set in the
	I1218 00:37:35.710644 1195787 command_runner.go:130] > # container image spec or in the container runtime configuration.
	I1218 00:37:35.710648 1195787 command_runner.go:130] > # default_env = [
	I1218 00:37:35.710790 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.710800 1195787 command_runner.go:130] > # If true, SELinux will be used for pod separation on the host.
	I1218 00:37:35.710816 1195787 command_runner.go:130] > # This option is deprecated, and will be interpreted from whether SELinux is enabled on the host in the future.
	I1218 00:37:35.710953 1195787 command_runner.go:130] > # selinux = false
	I1218 00:37:35.710964 1195787 command_runner.go:130] > # Path to the seccomp.json profile which is used as the default seccomp profile
	I1218 00:37:35.710972 1195787 command_runner.go:130] > # for the runtime. If not specified or set to "", then the internal default seccomp profile will be used.
	I1218 00:37:35.710977 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.710981 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.710993 1195787 command_runner.go:130] > # Enable a seccomp profile for privileged containers from the local path.
	I1218 00:37:35.710999 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711131 1195787 command_runner.go:130] > # privileged_seccomp_profile = ""
	I1218 00:37:35.711142 1195787 command_runner.go:130] > # Used to change the name of the default AppArmor profile of CRI-O. The default
	I1218 00:37:35.711149 1195787 command_runner.go:130] > # profile name is "crio-default". This profile only takes effect if the user
	I1218 00:37:35.711162 1195787 command_runner.go:130] > # does not specify a profile via the Kubernetes Pod's metadata annotation. If
	I1218 00:37:35.711169 1195787 command_runner.go:130] > # the profile is set to "unconfined", then this equals to disabling AppArmor.
	I1218 00:37:35.711174 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.711345 1195787 command_runner.go:130] > # apparmor_profile = "crio-default"
	I1218 00:37:35.711373 1195787 command_runner.go:130] > # Path to the blockio class configuration file for configuring
	I1218 00:37:35.711401 1195787 command_runner.go:130] > # the cgroup blockio controller.
	I1218 00:37:35.711419 1195787 command_runner.go:130] > # blockio_config_file = ""
	I1218 00:37:35.711456 1195787 command_runner.go:130] > # Reload blockio-config-file and rescan blockio devices in the system before applying
	I1218 00:37:35.711477 1195787 command_runner.go:130] > # blockio parameters.
	I1218 00:37:35.711667 1195787 command_runner.go:130] > # blockio_reload = false
	I1218 00:37:35.711706 1195787 command_runner.go:130] > # Used to change irqbalance service config file path which is used for configuring
	I1218 00:37:35.711725 1195787 command_runner.go:130] > # irqbalance daemon.
	I1218 00:37:35.711743 1195787 command_runner.go:130] > # irqbalance_config_file = "/etc/sysconfig/irqbalance"
	I1218 00:37:35.711776 1195787 command_runner.go:130] > # irqbalance_config_restore_file allows to set a cpu mask CRI-O should
	I1218 00:37:35.711801 1195787 command_runner.go:130] > # restore as irqbalance config at startup. Set to empty string to disable this flow entirely.
	I1218 00:37:35.711821 1195787 command_runner.go:130] > # By default, CRI-O manages the irqbalance configuration to enable dynamic IRQ pinning.
	I1218 00:37:35.711855 1195787 command_runner.go:130] > # irqbalance_config_restore_file = "/etc/sysconfig/orig_irq_banned_cpus"
	I1218 00:37:35.711879 1195787 command_runner.go:130] > # Path to the RDT configuration file for configuring the resctrl pseudo-filesystem.
	I1218 00:37:35.711898 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.712052 1195787 command_runner.go:130] > # rdt_config_file = ""
	I1218 00:37:35.712092 1195787 command_runner.go:130] > # Cgroup management implementation used for the runtime.
	I1218 00:37:35.712112 1195787 command_runner.go:130] > cgroup_manager = "cgroupfs"
	I1218 00:37:35.712133 1195787 command_runner.go:130] > # Specify whether the image pull must be performed in a separate cgroup.
	I1218 00:37:35.712151 1195787 command_runner.go:130] > # separate_pull_cgroup = ""
	I1218 00:37:35.712187 1195787 command_runner.go:130] > # List of default capabilities for containers. If it is empty or commented out,
	I1218 00:37:35.712206 1195787 command_runner.go:130] > # only the capabilities defined in the containers json file by the user/kube
	I1218 00:37:35.712253 1195787 command_runner.go:130] > # will be added.
	I1218 00:37:35.712276 1195787 command_runner.go:130] > # default_capabilities = [
	I1218 00:37:35.712420 1195787 command_runner.go:130] > # 	"CHOWN",
	I1218 00:37:35.712461 1195787 command_runner.go:130] > # 	"DAC_OVERRIDE",
	I1218 00:37:35.712541 1195787 command_runner.go:130] > # 	"FSETID",
	I1218 00:37:35.712631 1195787 command_runner.go:130] > # 	"FOWNER",
	I1218 00:37:35.712660 1195787 command_runner.go:130] > # 	"SETGID",
	I1218 00:37:35.712794 1195787 command_runner.go:130] > # 	"SETUID",
	I1218 00:37:35.712896 1195787 command_runner.go:130] > # 	"SETPCAP",
	I1218 00:37:35.712994 1195787 command_runner.go:130] > # 	"NET_BIND_SERVICE",
	I1218 00:37:35.713065 1195787 command_runner.go:130] > # 	"KILL",
	I1218 00:37:35.713149 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.713172 1195787 command_runner.go:130] > # Add capabilities to the inheritable set, as well as the default group of permitted, bounding and effective.
	I1218 00:37:35.713258 1195787 command_runner.go:130] > # If capabilities are expected to work for non-root users, this option should be set.
	I1218 00:37:35.713410 1195787 command_runner.go:130] > # add_inheritable_capabilities = false
	I1218 00:37:35.713489 1195787 command_runner.go:130] > # List of default sysctls. If it is empty or commented out, only the sysctls
	I1218 00:37:35.713545 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.713716 1195787 command_runner.go:130] > default_sysctls = [
	I1218 00:37:35.713734 1195787 command_runner.go:130] > 	"net.ipv4.ip_unprivileged_port_start=0",
	I1218 00:37:35.713949 1195787 command_runner.go:130] > ]
	I1218 00:37:35.713959 1195787 command_runner.go:130] > # List of devices on the host that a
	I1218 00:37:35.713966 1195787 command_runner.go:130] > # user can specify with the "io.kubernetes.cri-o.Devices" allowed annotation.
	I1218 00:37:35.713970 1195787 command_runner.go:130] > # allowed_devices = [
	I1218 00:37:35.713995 1195787 command_runner.go:130] > # 	"/dev/fuse",
	I1218 00:37:35.714000 1195787 command_runner.go:130] > # 	"/dev/net/tun",
	I1218 00:37:35.714003 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714008 1195787 command_runner.go:130] > # List of additional devices, specified as
	I1218 00:37:35.714016 1195787 command_runner.go:130] > # "<device-on-host>:<device-on-container>:<permissions>", for example: "--device=/dev/sdc:/dev/xvdc:rwm".
	I1218 00:37:35.714022 1195787 command_runner.go:130] > # If it is empty or commented out, only the devices
	I1218 00:37:35.714028 1195787 command_runner.go:130] > # defined in the container json file by the user/kube will be added.
	I1218 00:37:35.714032 1195787 command_runner.go:130] > # additional_devices = [
	I1218 00:37:35.714035 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714040 1195787 command_runner.go:130] > # List of directories to scan for CDI Spec files.
	I1218 00:37:35.714044 1195787 command_runner.go:130] > # cdi_spec_dirs = [
	I1218 00:37:35.714048 1195787 command_runner.go:130] > # 	"/etc/cdi",
	I1218 00:37:35.714052 1195787 command_runner.go:130] > # 	"/var/run/cdi",
	I1218 00:37:35.714056 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.714062 1195787 command_runner.go:130] > # Change the default behavior of setting container devices uid/gid from CRI's
	I1218 00:37:35.714068 1195787 command_runner.go:130] > # SecurityContext (RunAsUser/RunAsGroup) instead of taking host's uid/gid.
	I1218 00:37:35.714077 1195787 command_runner.go:130] > # Defaults to false.
	I1218 00:37:35.714083 1195787 command_runner.go:130] > # device_ownership_from_security_context = false
	I1218 00:37:35.714089 1195787 command_runner.go:130] > # Path to OCI hooks directories for automatically executed hooks. If one of the
	I1218 00:37:35.714100 1195787 command_runner.go:130] > # directories does not exist, then CRI-O will automatically skip them.
	I1218 00:37:35.714537 1195787 command_runner.go:130] > # hooks_dir = [
	I1218 00:37:35.714675 1195787 command_runner.go:130] > # 	"/usr/share/containers/oci/hooks.d",
	I1218 00:37:35.714791 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.715258 1195787 command_runner.go:130] > # Path to the file specifying the defaults mounts for each container. The
	I1218 00:37:35.715414 1195787 command_runner.go:130] > # format of the config is /SRC:/DST, one mount per line. Notice that CRI-O reads
	I1218 00:37:35.715601 1195787 command_runner.go:130] > # its default mounts from the following two files:
	I1218 00:37:35.715650 1195787 command_runner.go:130] > #
	I1218 00:37:35.715843 1195787 command_runner.go:130] > #   1) /etc/containers/mounts.conf (i.e., default_mounts_file): This is the
	I1218 00:37:35.715943 1195787 command_runner.go:130] > #      override file, where users can either add in their own default mounts, or
	I1218 00:37:35.716060 1195787 command_runner.go:130] > #      override the default mounts shipped with the package.
	I1218 00:37:35.716083 1195787 command_runner.go:130] > #
	I1218 00:37:35.716111 1195787 command_runner.go:130] > #   2) /usr/share/containers/mounts.conf: This is the default file read for
	I1218 00:37:35.716131 1195787 command_runner.go:130] > #      mounts. If you want CRI-O to read from a different, specific mounts file,
	I1218 00:37:35.716166 1195787 command_runner.go:130] > #      you can change the default_mounts_file. Note, if this is done, CRI-O will
	I1218 00:37:35.716187 1195787 command_runner.go:130] > #      only add mounts it finds in this file.
	I1218 00:37:35.716204 1195787 command_runner.go:130] > #
	I1218 00:37:35.716248 1195787 command_runner.go:130] > # default_mounts_file = ""
	I1218 00:37:35.716275 1195787 command_runner.go:130] > # Maximum number of processes allowed in a container.
	I1218 00:37:35.716306 1195787 command_runner.go:130] > # This option is deprecated. The Kubelet flag '--pod-pids-limit' should be used instead.
	I1218 00:37:35.717368 1195787 command_runner.go:130] > # pids_limit = -1
	I1218 00:37:35.717418 1195787 command_runner.go:130] > # Maximum size allowed for the container log file. Negative numbers indicate
	I1218 00:37:35.717442 1195787 command_runner.go:130] > # that no size limit is imposed. If it is positive, it must be >= 8192 to
	I1218 00:37:35.717463 1195787 command_runner.go:130] > # match/exceed conmon's read buffer. The file is truncated and re-opened so the
	I1218 00:37:35.717499 1195787 command_runner.go:130] > # limit is never exceeded. This option is deprecated. The Kubelet flag '--container-log-max-size' should be used instead.
	I1218 00:37:35.717521 1195787 command_runner.go:130] > # log_size_max = -1
	I1218 00:37:35.717693 1195787 command_runner.go:130] > # Whether container output should be logged to journald in addition to the kubernetes log file
	I1218 00:37:35.717720 1195787 command_runner.go:130] > # log_to_journald = false
	I1218 00:37:35.717752 1195787 command_runner.go:130] > # Path to directory in which container exit files are written to by conmon.
	I1218 00:37:35.717776 1195787 command_runner.go:130] > # container_exits_dir = "/var/run/crio/exits"
	I1218 00:37:35.717810 1195787 command_runner.go:130] > # Path to directory for container attach sockets.
	I1218 00:37:35.717835 1195787 command_runner.go:130] > # container_attach_socket_dir = "/var/run/crio"
	I1218 00:37:35.717855 1195787 command_runner.go:130] > # The prefix to use for the source of the bind mounts.
	I1218 00:37:35.717888 1195787 command_runner.go:130] > # bind_mount_prefix = ""
	I1218 00:37:35.717911 1195787 command_runner.go:130] > # If set to true, all containers will run in read-only mode.
	I1218 00:37:35.717929 1195787 command_runner.go:130] > # read_only = false
	I1218 00:37:35.717949 1195787 command_runner.go:130] > # Changes the verbosity of the logs based on the level it is set to. Options
	I1218 00:37:35.717978 1195787 command_runner.go:130] > # are fatal, panic, error, warn, info, debug and trace. This option supports
	I1218 00:37:35.717999 1195787 command_runner.go:130] > # live configuration reload.
	I1218 00:37:35.718017 1195787 command_runner.go:130] > # log_level = "info"
	I1218 00:37:35.718039 1195787 command_runner.go:130] > # Filter the log messages by the provided regular expression.
	I1218 00:37:35.718073 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.718091 1195787 command_runner.go:130] > # log_filter = ""
	I1218 00:37:35.718112 1195787 command_runner.go:130] > # The UID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718144 1195787 command_runner.go:130] > # specified in the form containerUID:HostUID:Size. Multiple ranges must be
	I1218 00:37:35.718167 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718189 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718221 1195787 command_runner.go:130] > # uid_mappings = ""
	I1218 00:37:35.718243 1195787 command_runner.go:130] > # The GID mappings for the user namespace of each container. A range is
	I1218 00:37:35.718262 1195787 command_runner.go:130] > # specified in the form containerGID:HostGID:Size. Multiple ranges must be
	I1218 00:37:35.718280 1195787 command_runner.go:130] > # separated by comma.
	I1218 00:37:35.718311 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718343 1195787 command_runner.go:130] > # gid_mappings = ""
	I1218 00:37:35.718363 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host UIDs below this value
	I1218 00:37:35.718395 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718420 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718442 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718481 1195787 command_runner.go:130] > # minimum_mappable_uid = -1
	I1218 00:37:35.718507 1195787 command_runner.go:130] > # If set, CRI-O will reject any attempt to map host GIDs below this value
	I1218 00:37:35.718529 1195787 command_runner.go:130] > # into user namespaces.  A negative value indicates that no minimum is set,
	I1218 00:37:35.718561 1195787 command_runner.go:130] > # so specifying mappings will only be allowed for pods that run as UID 0.
	I1218 00:37:35.718589 1195787 command_runner.go:130] > # This option is deprecated, and will be replaced with Kubernetes user namespace support (KEP-127) in the future.
	I1218 00:37:35.718607 1195787 command_runner.go:130] > # minimum_mappable_gid = -1
	I1218 00:37:35.718641 1195787 command_runner.go:130] > # The minimal amount of time in seconds to wait before issuing a timeout
	I1218 00:37:35.718665 1195787 command_runner.go:130] > # regarding the proper termination of the container. The lowest possible
	I1218 00:37:35.718685 1195787 command_runner.go:130] > # value is 30s, whereas lower values are not considered by CRI-O.
	I1218 00:37:35.718717 1195787 command_runner.go:130] > # ctr_stop_timeout = 30
	I1218 00:37:35.718741 1195787 command_runner.go:130] > # drop_infra_ctr determines whether CRI-O drops the infra container
	I1218 00:37:35.718762 1195787 command_runner.go:130] > # when a pod does not have a private PID namespace, and does not use
	I1218 00:37:35.718793 1195787 command_runner.go:130] > # a kernel separating runtime (like kata).
	I1218 00:37:35.718814 1195787 command_runner.go:130] > # It requires manage_ns_lifecycle to be true.
	I1218 00:37:35.718831 1195787 command_runner.go:130] > # drop_infra_ctr = true
	I1218 00:37:35.718851 1195787 command_runner.go:130] > # infra_ctr_cpuset determines what CPUs will be used to run infra containers.
	I1218 00:37:35.718882 1195787 command_runner.go:130] > # You can use linux CPU list format to specify desired CPUs.
	I1218 00:37:35.718907 1195787 command_runner.go:130] > # To get better isolation for guaranteed pods, set this parameter to be equal to kubelet reserved-cpus.
	I1218 00:37:35.718931 1195787 command_runner.go:130] > # infra_ctr_cpuset = ""
	I1218 00:37:35.718965 1195787 command_runner.go:130] > # shared_cpuset determines the CPU set which is allowed to be shared between guaranteed containers,
	I1218 00:37:35.718989 1195787 command_runner.go:130] > # regardless of, and in addition to, the exclusiveness of their CPUs.
	I1218 00:37:35.719009 1195787 command_runner.go:130] > # This field is optional and would not be used if not specified.
	I1218 00:37:35.719039 1195787 command_runner.go:130] > # You can specify CPUs in the Linux CPU list format.
	I1218 00:37:35.719315 1195787 command_runner.go:130] > # shared_cpuset = ""
	I1218 00:37:35.719348 1195787 command_runner.go:130] > # The directory where the state of the managed namespaces gets tracked.
	I1218 00:37:35.719365 1195787 command_runner.go:130] > # Only used when manage_ns_lifecycle is true.
	I1218 00:37:35.719396 1195787 command_runner.go:130] > # namespaces_dir = "/var/run"
	I1218 00:37:35.719423 1195787 command_runner.go:130] > # pinns_path is the path to find the pinns binary, which is needed to manage namespace lifecycle
	I1218 00:37:35.719450 1195787 command_runner.go:130] > # pinns_path = ""
	I1218 00:37:35.719484 1195787 command_runner.go:130] > # Globally enable/disable CRIU support which is necessary to
	I1218 00:37:35.719510 1195787 command_runner.go:130] > # checkpoint and restore container or pods (even if CRIU is found in $PATH).
	I1218 00:37:35.719528 1195787 command_runner.go:130] > # enable_criu_support = true
	I1218 00:37:35.719563 1195787 command_runner.go:130] > # Enable/disable the generation of the container,
	I1218 00:37:35.719586 1195787 command_runner.go:130] > # sandbox lifecycle events to be sent to the Kubelet to optimize the PLEG
	I1218 00:37:35.719602 1195787 command_runner.go:130] > # enable_pod_events = false
	I1218 00:37:35.719622 1195787 command_runner.go:130] > # default_runtime is the _name_ of the OCI runtime to be used as the default.
	I1218 00:37:35.719651 1195787 command_runner.go:130] > # The name is matched against the runtimes map below.
	I1218 00:37:35.719672 1195787 command_runner.go:130] > # default_runtime = "crun"
	I1218 00:37:35.719690 1195787 command_runner.go:130] > # A list of paths that, when absent from the host,
	I1218 00:37:35.719711 1195787 command_runner.go:130] > # will cause a container creation to fail (as opposed to the current behavior being created as a directory).
	I1218 00:37:35.719747 1195787 command_runner.go:130] > # This option is to protect from source locations whose existence as a directory could jeopardize the health of the node, and whose
	I1218 00:37:35.719770 1195787 command_runner.go:130] > # creation as a file is not desired either.
	I1218 00:37:35.719795 1195787 command_runner.go:130] > # An example is /etc/hostname, which will cause failures on reboot if it's created as a directory, but often doesn't exist because
	I1218 00:37:35.719826 1195787 command_runner.go:130] > # the hostname is being managed dynamically.
	I1218 00:37:35.719849 1195787 command_runner.go:130] > # absent_mount_sources_to_reject = [
	I1218 00:37:35.719865 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.719885 1195787 command_runner.go:130] > # The "crio.runtime.runtimes" table defines a list of OCI compatible runtimes.
	I1218 00:37:35.719916 1195787 command_runner.go:130] > # The runtime to use is picked based on the runtime handler provided by the CRI.
	I1218 00:37:35.719938 1195787 command_runner.go:130] > # If no runtime handler is provided, the "default_runtime" will be used.
	I1218 00:37:35.719957 1195787 command_runner.go:130] > # Each entry in the table should follow the format:
	I1218 00:37:35.719973 1195787 command_runner.go:130] > #
	I1218 00:37:35.720002 1195787 command_runner.go:130] > # [crio.runtime.runtimes.runtime-handler]
	I1218 00:37:35.720024 1195787 command_runner.go:130] > # runtime_path = "/path/to/the/executable"
	I1218 00:37:35.720041 1195787 command_runner.go:130] > # runtime_type = "oci"
	I1218 00:37:35.720059 1195787 command_runner.go:130] > # runtime_root = "/path/to/the/root"
	I1218 00:37:35.720096 1195787 command_runner.go:130] > # inherit_default_runtime = false
	I1218 00:37:35.720121 1195787 command_runner.go:130] > # monitor_path = "/path/to/container/monitor"
	I1218 00:37:35.720139 1195787 command_runner.go:130] > # monitor_cgroup = "/cgroup/path"
	I1218 00:37:35.720170 1195787 command_runner.go:130] > # monitor_exec_cgroup = "/cgroup/path"
	I1218 00:37:35.720190 1195787 command_runner.go:130] > # monitor_env = []
	I1218 00:37:35.720207 1195787 command_runner.go:130] > # privileged_without_host_devices = false
	I1218 00:37:35.720256 1195787 command_runner.go:130] > # allowed_annotations = []
	I1218 00:37:35.720274 1195787 command_runner.go:130] > # platform_runtime_paths = { "os/arch" = "/path/to/binary" }
	I1218 00:37:35.720279 1195787 command_runner.go:130] > # no_sync_log = false
	I1218 00:37:35.720284 1195787 command_runner.go:130] > # default_annotations = {}
	I1218 00:37:35.720288 1195787 command_runner.go:130] > # stream_websockets = false
	I1218 00:37:35.720292 1195787 command_runner.go:130] > # seccomp_profile = ""
	I1218 00:37:35.720348 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.720360 1195787 command_runner.go:130] > # - runtime-handler: Name used to identify the runtime.
	I1218 00:37:35.720367 1195787 command_runner.go:130] > # - runtime_path (optional, string): Absolute path to the runtime executable in
	I1218 00:37:35.720386 1195787 command_runner.go:130] > #   the host filesystem. If omitted, the runtime-handler identifier should match
	I1218 00:37:35.720399 1195787 command_runner.go:130] > #   the runtime executable name, and the runtime executable should be placed
	I1218 00:37:35.720403 1195787 command_runner.go:130] > #   in $PATH.
	I1218 00:37:35.720418 1195787 command_runner.go:130] > # - runtime_type (optional, string): Type of runtime, one of: "oci", "vm". If
	I1218 00:37:35.720433 1195787 command_runner.go:130] > #   omitted, an "oci" runtime is assumed.
	I1218 00:37:35.720439 1195787 command_runner.go:130] > # - runtime_root (optional, string): Root directory for storage of containers
	I1218 00:37:35.720444 1195787 command_runner.go:130] > #   state.
	I1218 00:37:35.720451 1195787 command_runner.go:130] > # - runtime_config_path (optional, string): the path for the runtime configuration
	I1218 00:37:35.720460 1195787 command_runner.go:130] > #   file. This can only be used when using the VM runtime_type.
	I1218 00:37:35.720466 1195787 command_runner.go:130] > # - inherit_default_runtime (optional, bool): when true the runtime_path,
	I1218 00:37:35.720473 1195787 command_runner.go:130] > #   runtime_type, runtime_root and runtime_config_path will be replaced by
	I1218 00:37:35.720480 1195787 command_runner.go:130] > #   the values from the default runtime at load time.
	I1218 00:37:35.720496 1195787 command_runner.go:130] > # - privileged_without_host_devices (optional, bool): an option for restricting
	I1218 00:37:35.720506 1195787 command_runner.go:130] > #   host devices from being passed to privileged containers.
	I1218 00:37:35.720513 1195787 command_runner.go:130] > # - allowed_annotations (optional, array of strings): an option for specifying
	I1218 00:37:35.720531 1195787 command_runner.go:130] > #   a list of experimental annotations that this runtime handler is allowed to process.
	I1218 00:37:35.720543 1195787 command_runner.go:130] > #   The currently recognized values are:
	I1218 00:37:35.720550 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.userns-mode" for configuring a user namespace for the pod.
	I1218 00:37:35.720566 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.cgroup2-mount-hierarchy-rw" for mounting cgroups writably when set to "true".
	I1218 00:37:35.720576 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.Devices" for configuring devices for the pod.
	I1218 00:37:35.720582 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.ShmSize" for configuring the size of /dev/shm.
	I1218 00:37:35.720590 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.UnifiedCgroup.$CTR_NAME" for configuring the cgroup v2 unified block for a container.
	I1218 00:37:35.720628 1195787 command_runner.go:130] > #   "io.containers.trace-syscall" for tracing syscalls via the OCI seccomp BPF hook.
	I1218 00:37:35.720649 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.seccompNotifierAction" for enabling the seccomp notifier feature.
	I1218 00:37:35.720665 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.umask" for setting the umask for the container init process.
	I1218 00:37:35.720671 1195787 command_runner.go:130] > #   "io.kubernetes.cri.rdt-class" for setting the RDT class of a container
	I1218 00:37:35.720679 1195787 command_runner.go:130] > #   "seccomp-profile.kubernetes.cri-o.io" for setting the seccomp profile for:
	I1218 00:37:35.720689 1195787 command_runner.go:130] > #     - a specific container by using: "seccomp-profile.kubernetes.cri-o.io/<CONTAINER_NAME>"
	I1218 00:37:35.720706 1195787 command_runner.go:130] > #     - a whole pod by using: "seccomp-profile.kubernetes.cri-o.io/POD"
	I1218 00:37:35.720719 1195787 command_runner.go:130] > #     Note that the annotation works on containers as well as on images.
	I1218 00:37:35.720733 1195787 command_runner.go:130] > #     For images, the plain annotation "seccomp-profile.kubernetes.cri-o.io"
	I1218 00:37:35.720746 1195787 command_runner.go:130] > #     can be used without the required "/POD" suffix or a container name.
	I1218 00:37:35.720754 1195787 command_runner.go:130] > #   "io.kubernetes.cri-o.DisableFIPS" for disabling FIPS mode in a Kubernetes pod within a FIPS-enabled cluster.
	I1218 00:37:35.720760 1195787 command_runner.go:130] > # - monitor_path (optional, string): The path of the monitor binary. Replaces
	I1218 00:37:35.720764 1195787 command_runner.go:130] > #   deprecated option "conmon".
	I1218 00:37:35.720772 1195787 command_runner.go:130] > # - monitor_cgroup (optional, string): The cgroup the container monitor process will be put in.
	I1218 00:37:35.720777 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_cgroup".
	I1218 00:37:35.720783 1195787 command_runner.go:130] > # - monitor_exec_cgroup (optional, string): If set to "container", indicates exec probes
	I1218 00:37:35.720795 1195787 command_runner.go:130] > #   should be moved to the container's cgroup
	I1218 00:37:35.720813 1195787 command_runner.go:130] > # - monitor_env (optional, array of strings): Environment variables to pass to the monitor.
	I1218 00:37:35.720825 1195787 command_runner.go:130] > #   Replaces deprecated option "conmon_env".
	I1218 00:37:35.720833 1195787 command_runner.go:130] > #   When using the pod runtime and conmon-rs, the monitor_env can be used to further configure
	I1218 00:37:35.720849 1195787 command_runner.go:130] > #   conmon-rs by using:
	I1218 00:37:35.720857 1195787 command_runner.go:130] > #     - LOG_DRIVER=[none,systemd,stdout] - Enable logging to the configured target, defaults to none.
	I1218 00:37:35.720865 1195787 command_runner.go:130] > #     - HEAPTRACK_OUTPUT_PATH=/path/to/dir - Enable heaptrack profiling and save the files to the set directory.
	I1218 00:37:35.720875 1195787 command_runner.go:130] > #     - HEAPTRACK_BINARY_PATH=/path/to/heaptrack - Enable heaptrack profiling and use set heaptrack binary.
	I1218 00:37:35.720882 1195787 command_runner.go:130] > # - platform_runtime_paths (optional, map): A mapping of platforms to the corresponding
	I1218 00:37:35.720888 1195787 command_runner.go:130] > #   runtime executable paths for the runtime handler.
	I1218 00:37:35.720897 1195787 command_runner.go:130] > # - container_min_memory (optional, string): The minimum memory that must be set for a container.
	I1218 00:37:35.720905 1195787 command_runner.go:130] > #   This value can be used to override the currently set global value for a specific runtime. If not set,
	I1218 00:37:35.720932 1195787 command_runner.go:130] > #   a global default value of "12 MiB" will be used.
	I1218 00:37:35.720946 1195787 command_runner.go:130] > # - no_sync_log (optional, bool): If set to true, the runtime will not sync the log file on rotate or container exit.
	I1218 00:37:35.720960 1195787 command_runner.go:130] > #   This option is only valid for the 'oci' runtime type. Setting this option to true can cause data loss, e.g.
	I1218 00:37:35.720968 1195787 command_runner.go:130] > #   when a machine crash happens.
	I1218 00:37:35.720975 1195787 command_runner.go:130] > # - default_annotations (optional, map): Default annotations if not overridden by the pod spec.
	I1218 00:37:35.720983 1195787 command_runner.go:130] > # - stream_websockets (optional, bool): Enable the WebSocket protocol for container exec, attach and port forward.
	I1218 00:37:35.720995 1195787 command_runner.go:130] > # - seccomp_profile (optional, string): The absolute path of the seccomp.json profile which is used as the default
	I1218 00:37:35.721013 1195787 command_runner.go:130] > #   seccomp profile for the runtime.
	I1218 00:37:35.721023 1195787 command_runner.go:130] > #   If not specified or set to "", the runtime seccomp_profile will be used.
	I1218 00:37:35.721031 1195787 command_runner.go:130] > #   If that is also not specified or set to "", the internal default seccomp profile will be applied.
	I1218 00:37:35.721033 1195787 command_runner.go:130] > #
	I1218 00:37:35.721038 1195787 command_runner.go:130] > # Using the seccomp notifier feature:
	I1218 00:37:35.721043 1195787 command_runner.go:130] > #
	I1218 00:37:35.721049 1195787 command_runner.go:130] > # This feature can help you to debug seccomp related issues, for example if
	I1218 00:37:35.721058 1195787 command_runner.go:130] > # blocked syscalls (permission denied errors) have negative impact on the workload.
	I1218 00:37:35.721061 1195787 command_runner.go:130] > #
	I1218 00:37:35.721072 1195787 command_runner.go:130] > # To be able to use this feature, configure a runtime which has the annotation
	I1218 00:37:35.721082 1195787 command_runner.go:130] > # "io.kubernetes.cri-o.seccompNotifierAction" in the allowed_annotations array.
	I1218 00:37:35.721085 1195787 command_runner.go:130] > #
	I1218 00:37:35.721091 1195787 command_runner.go:130] > # It also requires at least runc 1.1.0 or crun 0.19 which support the notifier
	I1218 00:37:35.721097 1195787 command_runner.go:130] > # feature.
	I1218 00:37:35.721100 1195787 command_runner.go:130] > #
	I1218 00:37:35.721106 1195787 command_runner.go:130] > # If everything is set up, CRI-O will modify chosen seccomp profiles for
	I1218 00:37:35.721112 1195787 command_runner.go:130] > # containers if the annotation "io.kubernetes.cri-o.seccompNotifierAction" is
	I1218 00:37:35.721119 1195787 command_runner.go:130] > # set on the Pod sandbox. CRI-O will then get notified if a container is using
	I1218 00:37:35.721125 1195787 command_runner.go:130] > # a blocked syscall and then terminate the workload after a timeout of 5
	I1218 00:37:35.721131 1195787 command_runner.go:130] > # seconds if the annotation is set to "io.kubernetes.cri-o.seccompNotifierAction=stop".
	I1218 00:37:35.721141 1195787 command_runner.go:130] > #
	I1218 00:37:35.721147 1195787 command_runner.go:130] > # This also means that multiple syscalls can be captured during that period,
	I1218 00:37:35.721153 1195787 command_runner.go:130] > # while the timeout will get reset once a new syscall has been discovered.
	I1218 00:37:35.721158 1195787 command_runner.go:130] > #
	I1218 00:37:35.721164 1195787 command_runner.go:130] > # This also means that the Pod's "restartPolicy" has to be set to "Never",
	I1218 00:37:35.721170 1195787 command_runner.go:130] > # otherwise the kubelet will restart the container immediately.
	I1218 00:37:35.721177 1195787 command_runner.go:130] > #
	I1218 00:37:35.721183 1195787 command_runner.go:130] > # Please be aware that CRI-O is not able to get notified if a syscall gets
	I1218 00:37:35.721188 1195787 command_runner.go:130] > # blocked based on the seccomp defaultAction, which is a general runtime
	I1218 00:37:35.721192 1195787 command_runner.go:130] > # limitation.
	I1218 00:37:35.721196 1195787 command_runner.go:130] > [crio.runtime.runtimes.crun]
	I1218 00:37:35.721200 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/crun"
	I1218 00:37:35.721204 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721215 1195787 command_runner.go:130] > runtime_root = "/run/crun"
	I1218 00:37:35.721228 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721232 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721236 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721241 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721248 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721251 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721255 1195787 command_runner.go:130] > allowed_annotations = [
	I1218 00:37:35.721261 1195787 command_runner.go:130] > 	"io.containers.trace-syscall",
	I1218 00:37:35.721265 1195787 command_runner.go:130] > ]
	I1218 00:37:35.721270 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721274 1195787 command_runner.go:130] > [crio.runtime.runtimes.runc]
	I1218 00:37:35.721279 1195787 command_runner.go:130] > runtime_path = "/usr/libexec/crio/runc"
	I1218 00:37:35.721282 1195787 command_runner.go:130] > runtime_type = ""
	I1218 00:37:35.721289 1195787 command_runner.go:130] > runtime_root = "/run/runc"
	I1218 00:37:35.721293 1195787 command_runner.go:130] > inherit_default_runtime = false
	I1218 00:37:35.721307 1195787 command_runner.go:130] > runtime_config_path = ""
	I1218 00:37:35.721312 1195787 command_runner.go:130] > container_min_memory = ""
	I1218 00:37:35.721316 1195787 command_runner.go:130] > monitor_path = "/usr/libexec/crio/conmon"
	I1218 00:37:35.721320 1195787 command_runner.go:130] > monitor_cgroup = "pod"
	I1218 00:37:35.721325 1195787 command_runner.go:130] > monitor_exec_cgroup = ""
	I1218 00:37:35.721331 1195787 command_runner.go:130] > privileged_without_host_devices = false
	I1218 00:37:35.721339 1195787 command_runner.go:130] > # The workloads table defines ways to customize containers with different resources
	I1218 00:37:35.721347 1195787 command_runner.go:130] > # that work based on annotations, rather than the CRI.
	I1218 00:37:35.721353 1195787 command_runner.go:130] > # Note, the behavior of this table is EXPERIMENTAL and may change at any time.
	I1218 00:37:35.721361 1195787 command_runner.go:130] > # Each workload has a name, activation_annotation, annotation_prefix and set of resources it supports mutating.
	I1218 00:37:35.721384 1195787 command_runner.go:130] > # The currently supported resources are "cpuperiod", "cpuquota", "cpushares", "cpulimit" and "cpuset". The values for "cpuperiod" and "cpuquota" are denoted in microseconds.
	I1218 00:37:35.721399 1195787 command_runner.go:130] > # The value for "cpulimit" is denoted in millicores; this value is used to calculate the "cpuquota" with the supplied "cpuperiod" or the default "cpuperiod".
	I1218 00:37:35.721406 1195787 command_runner.go:130] > # Note that the "cpulimit" field overrides the "cpuquota" value supplied in this configuration.
	I1218 00:37:35.721417 1195787 command_runner.go:130] > # Each resource can have a default value specified, or be empty.
	I1218 00:37:35.721427 1195787 command_runner.go:130] > # For a container to opt into this workload, the pod should be configured with the annotation $activation_annotation (key only, value is ignored).
	I1218 00:37:35.721438 1195787 command_runner.go:130] > # To customize per-container, an annotation of the form $annotation_prefix.$resource/$ctrName = "value" can be specified
	I1218 00:37:35.721444 1195787 command_runner.go:130] > # signifying for that resource type to override the default value.
	I1218 00:37:35.721457 1195787 command_runner.go:130] > # If the annotation_prefix is not present, every container in the pod will be given the default values.
	I1218 00:37:35.721461 1195787 command_runner.go:130] > # Example:
	I1218 00:37:35.721466 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type]
	I1218 00:37:35.721472 1195787 command_runner.go:130] > # activation_annotation = "io.crio/workload"
	I1218 00:37:35.721477 1195787 command_runner.go:130] > # annotation_prefix = "io.crio.workload-type"
	I1218 00:37:35.721487 1195787 command_runner.go:130] > # [crio.runtime.workloads.workload-type.resources]
	I1218 00:37:35.721490 1195787 command_runner.go:130] > # cpuset = "0-1"
	I1218 00:37:35.721494 1195787 command_runner.go:130] > # cpushares = "5"
	I1218 00:37:35.721498 1195787 command_runner.go:130] > # cpuquota = "1000"
	I1218 00:37:35.721502 1195787 command_runner.go:130] > # cpuperiod = "100000"
	I1218 00:37:35.721507 1195787 command_runner.go:130] > # cpulimit = "35"
	I1218 00:37:35.721510 1195787 command_runner.go:130] > # Where:
	I1218 00:37:35.721516 1195787 command_runner.go:130] > # The workload name is workload-type.
	I1218 00:37:35.721524 1195787 command_runner.go:130] > # To opt in, the pod must have the "io.crio.workload" annotation (this is a precise string match).
	I1218 00:37:35.721529 1195787 command_runner.go:130] > # This workload supports setting cpuset and cpu resources.
	I1218 00:37:35.721535 1195787 command_runner.go:130] > # annotation_prefix is used to customize the different resources.
	I1218 00:37:35.721544 1195787 command_runner.go:130] > # To configure the cpu shares a container gets in the example above, the pod would have to have the following annotation:
	I1218 00:37:35.721552 1195787 command_runner.go:130] > # "io.crio.workload-type/$container_name = {"cpushares": "value"}"
	I1218 00:37:35.721556 1195787 command_runner.go:130] > # hostnetwork_disable_selinux determines whether
	I1218 00:37:35.721563 1195787 command_runner.go:130] > # SELinux should be disabled within a pod when it is running in the host network namespace
	I1218 00:37:35.721568 1195787 command_runner.go:130] > # Default value is set to true
	I1218 00:37:35.721574 1195787 command_runner.go:130] > # hostnetwork_disable_selinux = true
	I1218 00:37:35.721580 1195787 command_runner.go:130] > # disable_hostport_mapping determines whether to enable/disable
	I1218 00:37:35.721588 1195787 command_runner.go:130] > # the container hostport mapping in CRI-O.
	I1218 00:37:35.721592 1195787 command_runner.go:130] > # Default value is set to 'false'
	I1218 00:37:35.721621 1195787 command_runner.go:130] > # disable_hostport_mapping = false
	I1218 00:37:35.721627 1195787 command_runner.go:130] > # timezone To set the timezone for a container in CRI-O.
	I1218 00:37:35.721635 1195787 command_runner.go:130] > # If an empty string is provided, CRI-O retains its default behavior. Use 'Local' to match the timezone of the host machine.
	I1218 00:37:35.721640 1195787 command_runner.go:130] > # timezone = ""
	I1218 00:37:35.721647 1195787 command_runner.go:130] > # The crio.image table contains settings pertaining to the management of OCI images.
	I1218 00:37:35.721650 1195787 command_runner.go:130] > #
	I1218 00:37:35.721656 1195787 command_runner.go:130] > # CRI-O reads its configured registry defaults from the system-wide
	I1218 00:37:35.721665 1195787 command_runner.go:130] > # containers-registries.conf(5) located in /etc/containers/registries.conf.
	I1218 00:37:35.721672 1195787 command_runner.go:130] > [crio.image]
	I1218 00:37:35.721679 1195787 command_runner.go:130] > # Default transport for pulling images from a remote container storage.
	I1218 00:37:35.721683 1195787 command_runner.go:130] > # default_transport = "docker://"
	I1218 00:37:35.721689 1195787 command_runner.go:130] > # The path to a file containing credentials necessary for pulling images from
	I1218 00:37:35.721701 1195787 command_runner.go:130] > # secure registries. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721706 1195787 command_runner.go:130] > # global_auth_file = ""
	I1218 00:37:35.721711 1195787 command_runner.go:130] > # The image used to instantiate infra containers.
	I1218 00:37:35.721723 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721728 1195787 command_runner.go:130] > # pause_image = "registry.k8s.io/pause:3.10.1"
	I1218 00:37:35.721738 1195787 command_runner.go:130] > # The path to a file containing credentials specific for pulling the pause_image from
	I1218 00:37:35.721745 1195787 command_runner.go:130] > # above. The file is similar to that of /var/lib/kubelet/config.json
	I1218 00:37:35.721754 1195787 command_runner.go:130] > # This option supports live configuration reload.
	I1218 00:37:35.721758 1195787 command_runner.go:130] > # pause_image_auth_file = ""
	I1218 00:37:35.721764 1195787 command_runner.go:130] > # The command to run to have a container stay in the paused state.
	I1218 00:37:35.721769 1195787 command_runner.go:130] > # When explicitly set to "", it will fall back to the entrypoint and command
	I1218 00:37:35.721776 1195787 command_runner.go:130] > # specified in the pause image. When commented out, it will fall back to the
	I1218 00:37:35.721781 1195787 command_runner.go:130] > # default: "/pause". This option supports live configuration reload.
	I1218 00:37:35.721787 1195787 command_runner.go:130] > # pause_command = "/pause"
	I1218 00:37:35.721793 1195787 command_runner.go:130] > # List of images to be excluded from the kubelet's garbage collection.
	I1218 00:37:35.721799 1195787 command_runner.go:130] > # It allows specifying image names using either exact, glob, or keyword
	I1218 00:37:35.721805 1195787 command_runner.go:130] > # patterns. Exact matches must match the entire name, glob matches can
	I1218 00:37:35.721813 1195787 command_runner.go:130] > # have a wildcard * at the end, and keyword matches can have wildcards
	I1218 00:37:35.721819 1195787 command_runner.go:130] > # on both ends. By default, this list includes the "pause" image if
	I1218 00:37:35.721825 1195787 command_runner.go:130] > # configured by the user, which is used as a placeholder in Kubernetes pods.
	I1218 00:37:35.721831 1195787 command_runner.go:130] > # pinned_images = [
	I1218 00:37:35.721834 1195787 command_runner.go:130] > # ]
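The three pattern styles described above might be filled in like this; the image names are illustrative only, not values from this cluster:

```toml
# Hypothetical pinned_images values showing the three pattern styles:
pinned_images = [
	"registry.k8s.io/pause:3.10.1", # exact: must match the entire name
	"registry.k8s.io/kube-*",       # glob: a trailing * wildcard
	"*coredns*",                    # keyword: wildcards on both ends
]
```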
	I1218 00:37:35.721840 1195787 command_runner.go:130] > # Path to the file which decides what sort of policy we use when deciding
	I1218 00:37:35.721846 1195787 command_runner.go:130] > # whether or not to trust an image that we've pulled. It is not recommended that
	I1218 00:37:35.721853 1195787 command_runner.go:130] > # this option be used, as the default behavior of using the system-wide default
	I1218 00:37:35.721859 1195787 command_runner.go:130] > # policy (i.e., /etc/containers/policy.json) is most often preferred. Please
	I1218 00:37:35.721866 1195787 command_runner.go:130] > # refer to containers-policy.json(5) for more details.
	I1218 00:37:35.721871 1195787 command_runner.go:130] > signature_policy = "/etc/crio/policy.json"
	I1218 00:37:35.721879 1195787 command_runner.go:130] > # Root path for pod namespace-separated signature policies.
	I1218 00:37:35.721892 1195787 command_runner.go:130] > # The final policy to be used on image pull will be <SIGNATURE_POLICY_DIR>/<NAMESPACE>.json.
	I1218 00:37:35.721901 1195787 command_runner.go:130] > # If no pod namespace is being provided on image pull (via the sandbox config),
	I1218 00:37:35.721912 1195787 command_runner.go:130] > # or the concatenated path is non-existent, then the signature_policy or system
	I1218 00:37:35.721918 1195787 command_runner.go:130] > # wide policy will be used as fallback. Must be an absolute path.
	I1218 00:37:35.721923 1195787 command_runner.go:130] > # signature_policy_dir = "/etc/crio/policies"
	I1218 00:37:35.721928 1195787 command_runner.go:130] > # List of registries to skip TLS verification for pulling images. Please
	I1218 00:37:35.721935 1195787 command_runner.go:130] > # consider configuring the registries via /etc/containers/registries.conf before
	I1218 00:37:35.721938 1195787 command_runner.go:130] > # changing them here.
	I1218 00:37:35.721944 1195787 command_runner.go:130] > # This option is deprecated. Use registries.conf file instead.
	I1218 00:37:35.721955 1195787 command_runner.go:130] > # insecure_registries = [
	I1218 00:37:35.721957 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.721964 1195787 command_runner.go:130] > # Controls how image volumes are handled. The valid values are mkdir, bind and
	I1218 00:37:35.721969 1195787 command_runner.go:130] > # ignore; the last will ignore volumes entirely.
	I1218 00:37:35.721977 1195787 command_runner.go:130] > # image_volumes = "mkdir"
	I1218 00:37:35.721983 1195787 command_runner.go:130] > # Temporary directory to use for storing big files
	I1218 00:37:35.721987 1195787 command_runner.go:130] > # big_files_temporary_dir = ""
	I1218 00:37:35.721998 1195787 command_runner.go:130] > # If true, CRI-O will automatically reload the mirror registry when
	I1218 00:37:35.722005 1195787 command_runner.go:130] > # there is an update to the 'registries.conf.d' directory. Default value is set to 'false'.
	I1218 00:37:35.722009 1195787 command_runner.go:130] > # auto_reload_registries = false
	I1218 00:37:35.722015 1195787 command_runner.go:130] > # The timeout for an image pull to make progress until the pull operation
	I1218 00:37:35.722024 1195787 command_runner.go:130] > # gets canceled. This value will also be used to calculate the pull progress interval as pull_progress_timeout / 10.
	I1218 00:37:35.722031 1195787 command_runner.go:130] > # Can be set to 0 to disable the timeout as well as the progress output.
	I1218 00:37:35.722036 1195787 command_runner.go:130] > # pull_progress_timeout = "0s"
	I1218 00:37:35.722048 1195787 command_runner.go:130] > # The mode of short name resolution.
	I1218 00:37:35.722054 1195787 command_runner.go:130] > # The valid values are "enforcing" and "disabled", and the default is "enforcing".
	I1218 00:37:35.722062 1195787 command_runner.go:130] > # If "enforcing", an image pull will fail if a short name is used and the results are ambiguous.
	I1218 00:37:35.722070 1195787 command_runner.go:130] > # If "disabled", the first result will be chosen.
	I1218 00:37:35.722074 1195787 command_runner.go:130] > # short_name_mode = "enforcing"
	I1218 00:37:35.722081 1195787 command_runner.go:130] > # OCIArtifactMountSupport controls whether CRI-O should support OCI artifacts.
	I1218 00:37:35.722087 1195787 command_runner.go:130] > # If set to false, mounting OCI Artifacts will result in an error.
	I1218 00:37:35.722091 1195787 command_runner.go:130] > # oci_artifact_mount_support = true
	I1218 00:37:35.722097 1195787 command_runner.go:130] > # The crio.network table contains settings pertaining to the management of
	I1218 00:37:35.722108 1195787 command_runner.go:130] > # CNI plugins.
	I1218 00:37:35.722117 1195787 command_runner.go:130] > [crio.network]
	I1218 00:37:35.722131 1195787 command_runner.go:130] > # The default CNI network name to be selected. If not set or "", then
	I1218 00:37:35.722136 1195787 command_runner.go:130] > # CRI-O will pick up the first one found in network_dir.
	I1218 00:37:35.722142 1195787 command_runner.go:130] > # cni_default_network = ""
	I1218 00:37:35.722148 1195787 command_runner.go:130] > # Path to the directory where CNI configuration files are located.
	I1218 00:37:35.722156 1195787 command_runner.go:130] > # network_dir = "/etc/cni/net.d/"
	I1218 00:37:35.722162 1195787 command_runner.go:130] > # Paths to directories where CNI plugin binaries are located.
	I1218 00:37:35.722165 1195787 command_runner.go:130] > # plugin_dirs = [
	I1218 00:37:35.722169 1195787 command_runner.go:130] > # 	"/opt/cni/bin/",
	I1218 00:37:35.722172 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722176 1195787 command_runner.go:130] > # List of included pod metrics.
	I1218 00:37:35.722180 1195787 command_runner.go:130] > # included_pod_metrics = [
	I1218 00:37:35.722182 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722190 1195787 command_runner.go:130] > # A necessary configuration for Prometheus-based metrics retrieval
	I1218 00:37:35.722196 1195787 command_runner.go:130] > [crio.metrics]
	I1218 00:37:35.722201 1195787 command_runner.go:130] > # Globally enable or disable metrics support.
	I1218 00:37:35.722205 1195787 command_runner.go:130] > # enable_metrics = false
	I1218 00:37:35.722209 1195787 command_runner.go:130] > # Specify enabled metrics collectors.
	I1218 00:37:35.722215 1195787 command_runner.go:130] > # Per default all metrics are enabled.
	I1218 00:37:35.722222 1195787 command_runner.go:130] > # It is possible to prefix the metrics with "container_runtime_" and "crio_".
	I1218 00:37:35.722233 1195787 command_runner.go:130] > # For example, the metrics collector "operations" would be treated in the same
	I1218 00:37:35.722239 1195787 command_runner.go:130] > # way as "crio_operations" and "container_runtime_crio_operations".
	I1218 00:37:35.722243 1195787 command_runner.go:130] > # metrics_collectors = [
	I1218 00:37:35.722247 1195787 command_runner.go:130] > # 	"image_pulls_layer_size",
	I1218 00:37:35.722252 1195787 command_runner.go:130] > # 	"containers_events_dropped_total",
	I1218 00:37:35.722256 1195787 command_runner.go:130] > # 	"containers_oom_total",
	I1218 00:37:35.722260 1195787 command_runner.go:130] > # 	"processes_defunct",
	I1218 00:37:35.722266 1195787 command_runner.go:130] > # 	"operations_total",
	I1218 00:37:35.722270 1195787 command_runner.go:130] > # 	"operations_latency_seconds",
	I1218 00:37:35.722275 1195787 command_runner.go:130] > # 	"operations_latency_seconds_total",
	I1218 00:37:35.722279 1195787 command_runner.go:130] > # 	"operations_errors_total",
	I1218 00:37:35.722283 1195787 command_runner.go:130] > # 	"image_pulls_bytes_total",
	I1218 00:37:35.722287 1195787 command_runner.go:130] > # 	"image_pulls_skipped_bytes_total",
	I1218 00:37:35.722295 1195787 command_runner.go:130] > # 	"image_pulls_failure_total",
	I1218 00:37:35.722299 1195787 command_runner.go:130] > # 	"image_pulls_success_total",
	I1218 00:37:35.722312 1195787 command_runner.go:130] > # 	"image_layer_reuse_total",
	I1218 00:37:35.722316 1195787 command_runner.go:130] > # 	"containers_oom_count_total",
	I1218 00:37:35.722321 1195787 command_runner.go:130] > # 	"containers_seccomp_notifier_count_total",
	I1218 00:37:35.722325 1195787 command_runner.go:130] > # 	"resources_stalled_at_stage",
	I1218 00:37:35.722329 1195787 command_runner.go:130] > # 	"containers_stopped_monitor_count",
	I1218 00:37:35.722332 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722338 1195787 command_runner.go:130] > # The IP address or hostname on which the metrics server will listen.
	I1218 00:37:35.722342 1195787 command_runner.go:130] > # metrics_host = "127.0.0.1"
	I1218 00:37:35.722347 1195787 command_runner.go:130] > # The port on which the metrics server will listen.
	I1218 00:37:35.722351 1195787 command_runner.go:130] > # metrics_port = 9090
	I1218 00:37:35.722358 1195787 command_runner.go:130] > # Local socket path to bind the metrics server to
	I1218 00:37:35.722362 1195787 command_runner.go:130] > # metrics_socket = ""
	I1218 00:37:35.722377 1195787 command_runner.go:130] > # The certificate for the secure metrics server.
	I1218 00:37:35.722386 1195787 command_runner.go:130] > # If the certificate is not available on disk, then CRI-O will generate a
	I1218 00:37:35.722398 1195787 command_runner.go:130] > # self-signed one. CRI-O also watches for changes of this path and reloads the
	I1218 00:37:35.722403 1195787 command_runner.go:130] > # certificate on any modification event.
	I1218 00:37:35.722406 1195787 command_runner.go:130] > # metrics_cert = ""
	I1218 00:37:35.722411 1195787 command_runner.go:130] > # The certificate key for the secure metrics server.
	I1218 00:37:35.722421 1195787 command_runner.go:130] > # Behaves in the same way as the metrics_cert.
	I1218 00:37:35.722424 1195787 command_runner.go:130] > # metrics_key = ""
	I1218 00:37:35.722433 1195787 command_runner.go:130] > # A necessary configuration for OpenTelemetry trace data exporting
	I1218 00:37:35.722437 1195787 command_runner.go:130] > [crio.tracing]
	I1218 00:37:35.722445 1195787 command_runner.go:130] > # Globally enable or disable exporting OpenTelemetry traces.
	I1218 00:37:35.722451 1195787 command_runner.go:130] > # enable_tracing = false
	I1218 00:37:35.722464 1195787 command_runner.go:130] > # Address on which the gRPC trace collector listens.
	I1218 00:37:35.722472 1195787 command_runner.go:130] > # tracing_endpoint = "127.0.0.1:4317"
	I1218 00:37:35.722479 1195787 command_runner.go:130] > # Number of samples to collect per million spans. Set to 1000000 to always sample.
	I1218 00:37:35.722485 1195787 command_runner.go:130] > # tracing_sampling_rate_per_million = 0
	I1218 00:37:35.722490 1195787 command_runner.go:130] > # CRI-O NRI configuration.
	I1218 00:37:35.722493 1195787 command_runner.go:130] > [crio.nri]
	I1218 00:37:35.722498 1195787 command_runner.go:130] > # Globally enable or disable NRI.
	I1218 00:37:35.722507 1195787 command_runner.go:130] > # enable_nri = true
	I1218 00:37:35.722519 1195787 command_runner.go:130] > # NRI socket to listen on.
	I1218 00:37:35.722524 1195787 command_runner.go:130] > # nri_listen = "/var/run/nri/nri.sock"
	I1218 00:37:35.722528 1195787 command_runner.go:130] > # NRI plugin directory to use.
	I1218 00:37:35.722539 1195787 command_runner.go:130] > # nri_plugin_dir = "/opt/nri/plugins"
	I1218 00:37:35.722544 1195787 command_runner.go:130] > # NRI plugin configuration directory to use.
	I1218 00:37:35.722549 1195787 command_runner.go:130] > # nri_plugin_config_dir = "/etc/nri/conf.d"
	I1218 00:37:35.722557 1195787 command_runner.go:130] > # Disable connections from externally launched NRI plugins.
	I1218 00:37:35.722613 1195787 command_runner.go:130] > # nri_disable_connections = false
	I1218 00:37:35.722623 1195787 command_runner.go:130] > # Timeout for a plugin to register itself with NRI.
	I1218 00:37:35.722628 1195787 command_runner.go:130] > # nri_plugin_registration_timeout = "5s"
	I1218 00:37:35.722634 1195787 command_runner.go:130] > # Timeout for a plugin to handle an NRI request.
	I1218 00:37:35.722640 1195787 command_runner.go:130] > # nri_plugin_request_timeout = "2s"
	I1218 00:37:35.722645 1195787 command_runner.go:130] > # NRI default validator configuration.
	I1218 00:37:35.722651 1195787 command_runner.go:130] > # If enabled, the builtin default validator can be used to reject a container if some
	I1218 00:37:35.722658 1195787 command_runner.go:130] > # NRI plugin requested a restricted adjustment. Currently the following adjustments
	I1218 00:37:35.722663 1195787 command_runner.go:130] > # can be restricted/rejected:
	I1218 00:37:35.722666 1195787 command_runner.go:130] > # - OCI hook injection
	I1218 00:37:35.722671 1195787 command_runner.go:130] > # - adjustment of runtime default seccomp profile
	I1218 00:37:35.722677 1195787 command_runner.go:130] > # - adjustment of unconfined seccomp profile
	I1218 00:37:35.722683 1195787 command_runner.go:130] > # - adjustment of a custom seccomp profile
	I1218 00:37:35.722689 1195787 command_runner.go:130] > # - adjustment of linux namespaces
	I1218 00:37:35.722696 1195787 command_runner.go:130] > # Additionally, the default validator can be used to reject container creation if any
	I1218 00:37:35.722702 1195787 command_runner.go:130] > # of a required set of plugins has not processed a container creation request, unless
	I1218 00:37:35.722709 1195787 command_runner.go:130] > # the container has been annotated to tolerate a missing plugin.
	I1218 00:37:35.722712 1195787 command_runner.go:130] > #
	I1218 00:37:35.722717 1195787 command_runner.go:130] > # [crio.nri.default_validator]
	I1218 00:37:35.722724 1195787 command_runner.go:130] > # nri_enable_default_validator = false
	I1218 00:37:35.722729 1195787 command_runner.go:130] > # nri_validator_reject_oci_hook_adjustment = false
	I1218 00:37:35.722734 1195787 command_runner.go:130] > # nri_validator_reject_runtime_default_seccomp_adjustment = false
	I1218 00:37:35.722739 1195787 command_runner.go:130] > # nri_validator_reject_unconfined_seccomp_adjustment = false
	I1218 00:37:35.722744 1195787 command_runner.go:130] > # nri_validator_reject_custom_seccomp_adjustment = false
	I1218 00:37:35.722749 1195787 command_runner.go:130] > # nri_validator_reject_namespace_adjustment = false
	I1218 00:37:35.722759 1195787 command_runner.go:130] > # nri_validator_required_plugins = [
	I1218 00:37:35.722765 1195787 command_runner.go:130] > # ]
	I1218 00:37:35.722771 1195787 command_runner.go:130] > # nri_validator_tolerate_missing_plugins_annotation = ""
	I1218 00:37:35.722777 1195787 command_runner.go:130] > # Necessary information pertaining to container and pod stats reporting.
	I1218 00:37:35.722788 1195787 command_runner.go:130] > [crio.stats]
	I1218 00:37:35.722797 1195787 command_runner.go:130] > # The number of seconds between collecting pod and container stats.
	I1218 00:37:35.722805 1195787 command_runner.go:130] > # If set to 0, the stats are collected on-demand instead.
	I1218 00:37:35.722809 1195787 command_runner.go:130] > # stats_collection_period = 0
	I1218 00:37:35.722814 1195787 command_runner.go:130] > # The number of seconds between collecting pod/container stats and pod
	I1218 00:37:35.722821 1195787 command_runner.go:130] > # sandbox metrics. If set to 0, the metrics/stats are collected on-demand instead.
	I1218 00:37:35.722825 1195787 command_runner.go:130] > # collection_period = 0
	I1218 00:37:35.722870 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686277403Z" level=info msg="Updating config from single file: /etc/crio/crio.conf"
	I1218 00:37:35.722885 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686455769Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf"
	I1218 00:37:35.722906 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686635242Z" level=info msg="Skipping not-existing config file \"/etc/crio/crio.conf\""
	I1218 00:37:35.722915 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686725939Z" level=info msg="Updating config from path: /etc/crio/crio.conf.d"
	I1218 00:37:35.722930 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.686860827Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:37:35.722940 1195787 command_runner.go:130] ! time="2025-12-18T00:37:35.687143526Z" level=info msg="Updating config from drop-in file: /etc/crio/crio.conf.d/10-crio.conf"
	I1218 00:37:35.722954 1195787 command_runner.go:130] ! level=info msg="Using default capabilities: CAP_CHOWN, CAP_DAC_OVERRIDE, CAP_FSETID, CAP_FOWNER, CAP_SETGID, CAP_SETUID, CAP_SETPCAP, CAP_NET_BIND_SERVICE, CAP_KILL"
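The NRI lines above are CRI-O's commented-out generated defaults. To actually enable NRI, a small drop-in placed in the `conf.d` directory the log shows being scanned would suffice; the following is a sketch (hypothetical file name, values copied from the defaults above), not something this test run applies:

```toml
# /etc/crio/crio.conf.d/20-nri.conf (hypothetical drop-in)
[crio.nri]
enable_nri = true
nri_listen = "/var/run/nri/nri.sock"
nri_plugin_dir = "/opt/nri/plugins"
```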
	I1218 00:37:35.723070 1195787 cni.go:84] Creating CNI manager for ""
	I1218 00:37:35.723084 1195787 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:37:35.723105 1195787 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:37:35.723135 1195787 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath
:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:37:35.723264 1195787 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:37:35.723342 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:37:35.730799 1195787 command_runner.go:130] > kubeadm
	I1218 00:37:35.730815 1195787 command_runner.go:130] > kubectl
	I1218 00:37:35.730820 1195787 command_runner.go:130] > kubelet
	I1218 00:37:35.730852 1195787 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:37:35.730903 1195787 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:37:35.737892 1195787 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:37:35.749699 1195787 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:37:35.761635 1195787 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2219 bytes)
	I1218 00:37:35.773650 1195787 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:37:35.777155 1195787 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
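The `grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts` above is minikube's idempotence check before touching `/etc/hosts`. The ensure-entry logic can be sketched against a scratch file (a stand-in path, not the real `/etc/hosts`):

```shell
# Ensure an IP -> hostname mapping exists, appending only when the exact
# entry is missing (scratch file standing in for /etc/hosts).
set -eu
hosts="$(mktemp)"
ip="192.168.49.2"
name="control-plane.minikube.internal"

ensure_entry() {
  grep -q "^$ip[[:space:]]\{1,\}$name\$" "$hosts" \
    || printf '%s\t%s\n' "$ip" "$name" >> "$hosts"
}

ensure_entry   # appends the line
ensure_entry   # second call is a no-op
```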
	I1218 00:37:35.777265 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:35.913809 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:36.641224 1195787 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:37:36.641246 1195787 certs.go:195] generating shared ca certs ...
	I1218 00:37:36.641263 1195787 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:36.641410 1195787 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:37:36.641464 1195787 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:37:36.641475 1195787 certs.go:257] generating profile certs ...
	I1218 00:37:36.641577 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:37:36.641667 1195787 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:37:36.641711 1195787 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:37:36.641724 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1218 00:37:36.641737 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1218 00:37:36.641753 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1218 00:37:36.641763 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1218 00:37:36.641780 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1218 00:37:36.641792 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1218 00:37:36.641807 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1218 00:37:36.641818 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1218 00:37:36.641873 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:37:36.641907 1195787 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:37:36.641920 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:37:36.641952 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:37:36.641982 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:37:36.642014 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:37:36.642068 1195787 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:37:36.642106 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /usr/share/ca-certificates/11595522.pem
	I1218 00:37:36.642122 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.642133 1195787 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem -> /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.642704 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:37:36.662928 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:37:36.685489 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:37:36.708038 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:37:36.726679 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:37:36.744109 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:37:36.760724 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:37:36.777802 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:37:36.794736 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:37:36.811089 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:37:36.827838 1195787 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:37:36.844718 1195787 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:37:36.856626 1195787 ssh_runner.go:195] Run: openssl version
	I1218 00:37:36.862122 1195787 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1218 00:37:36.862595 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.869813 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:37:36.876968 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880287 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880319 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.880364 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:37:36.920445 1195787 command_runner.go:130] > b5213941
	I1218 00:37:36.920887 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:37:36.928015 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.934857 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:37:36.941992 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945456 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945522 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.945583 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:37:36.985712 1195787 command_runner.go:130] > 51391683
	I1218 00:37:36.986191 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:37:36.993294 1195787 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.001803 1195787 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:37:37.011590 1195787 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.016819 1195787 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017267 1195787 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.017348 1195787 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:37:37.061113 1195787 command_runner.go:130] > 3ec20f2e
	I1218 00:37:37.061606 1195787 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
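The hash-then-symlink sequence above (`openssl x509 -hash`, then `ln -fs` to `/etc/ssl/certs/<hash>.0`) is how OpenSSL's directory-based (`-CApath`) CA lookup works: each trusted certificate must be reachable under its subject-hash name with a `.0` suffix. A minimal reproduction with a throwaway CA (hypothetical paths, not the minikube files):

```shell
# Reproduce the subject-hash symlink scheme that -CApath lookup expects.
set -eu
cd "$(mktemp -d)"

# Throwaway self-signed CA standing in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=example-ca" -keyout ca.key -out minikubeCA.pem 2>/dev/null

# OpenSSL finds CAs in a directory via <subject-hash>.0 symlinks.
hash="$(openssl x509 -hash -noout -in minikubeCA.pem)"
mkdir certs
ln -fs "$PWD/minikubeCA.pem" "certs/$hash.0"

# With the symlink in place, directory-based verification succeeds.
openssl verify -CApath certs minikubeCA.pem
```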
	I1218 00:37:37.068668 1195787 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072025 1195787 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:37:37.072050 1195787 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1218 00:37:37.072057 1195787 command_runner.go:130] > Device: 259,1	Inode: 1326178     Links: 1
	I1218 00:37:37.072063 1195787 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1218 00:37:37.072070 1195787 command_runner.go:130] > Access: 2025-12-18 00:33:28.828061434 +0000
	I1218 00:37:37.072075 1195787 command_runner.go:130] > Modify: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072080 1195787 command_runner.go:130] > Change: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072086 1195787 command_runner.go:130] >  Birth: 2025-12-18 00:29:23.775745490 +0000
	I1218 00:37:37.072155 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:37:37.111978 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.112489 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:37:37.152999 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.153074 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:37:37.194884 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.195292 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:37:37.235218 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.235658 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:37:37.275710 1195787 command_runner.go:130] > Certificate will not expire
	I1218 00:37:37.276177 1195787 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:37:37.316082 1195787 command_runner.go:130] > Certificate will not expire
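Each "Certificate will not expire" line above comes from `openssl x509 -checkend 86400`: exit 0 when the certificate is still valid N seconds from now, exit 1 (and "Certificate will expire") when it would have expired by then. A self-contained demonstration with a throwaway certificate (hypothetical names, not the minikube certs):

```shell
# Demonstrate the -checkend probe used above.
set -eu
cd "$(mktemp -d)"

# Throwaway certificate valid for 2 days.
openssl req -x509 -newkey rsa:2048 -nodes -days 2 \
  -subj "/CN=probe" -keyout probe.key -out probe.crt 2>/dev/null

openssl x509 -noout -in probe.crt -checkend 86400    # 24h out: still valid, exit 0
! openssl x509 -noout -in probe.crt -checkend 259200 # 72h out: would expire, exit 1
```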
	I1218 00:37:37.316486 1195787 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwa
rePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:37:37.316593 1195787 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:37:37.316685 1195787 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:37:37.341722 1195787 cri.go:89] found id: ""
	I1218 00:37:37.341828 1195787 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:37:37.348335 1195787 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1218 00:37:37.348357 1195787 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1218 00:37:37.348372 1195787 command_runner.go:130] > /var/lib/minikube/etcd:
	I1218 00:37:37.349183 1195787 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:37:37.349197 1195787 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:37:37.349253 1195787 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:37:37.356307 1195787 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:37:37.356734 1195787 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-288604" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.356836 1195787 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "functional-288604" cluster setting kubeconfig missing "functional-288604" context setting]
	I1218 00:37:37.357097 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.357514 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.357675 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.358178 1195787 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1218 00:37:37.358185 1195787 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 00:37:37.358343 1195787 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 00:37:37.358365 1195787 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 00:37:37.358389 1195787 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 00:37:37.358400 1195787 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 00:37:37.358747 1195787 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:37:37.366250 1195787 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1218 00:37:37.366287 1195787 kubeadm.go:602] duration metric: took 17.084351ms to restartPrimaryControlPlane
	I1218 00:37:37.366297 1195787 kubeadm.go:403] duration metric: took 49.819997ms to StartCluster
	I1218 00:37:37.366310 1195787 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.366369 1195787 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.366947 1195787 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:37:37.367145 1195787 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 00:37:37.367532 1195787 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:37:37.367580 1195787 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 00:37:37.367705 1195787 addons.go:70] Setting storage-provisioner=true in profile "functional-288604"
	I1218 00:37:37.367724 1195787 addons.go:239] Setting addon storage-provisioner=true in "functional-288604"
	I1218 00:37:37.367744 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.368436 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.368583 1195787 addons.go:70] Setting default-storageclass=true in profile "functional-288604"
	I1218 00:37:37.368601 1195787 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-288604"
	I1218 00:37:37.368944 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.373199 1195787 out.go:179] * Verifying Kubernetes components...
	I1218 00:37:37.376080 1195787 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:37:37.397822 1195787 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:37:37.397983 1195787 kapi.go:59] client config for functional-288604: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 00:37:37.398246 1195787 addons.go:239] Setting addon default-storageclass=true in "functional-288604"
	I1218 00:37:37.398278 1195787 host.go:66] Checking if "functional-288604" exists ...
	I1218 00:37:37.398894 1195787 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:37:37.407451 1195787 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1218 00:37:37.410300 1195787 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.410322 1195787 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1218 00:37:37.410384 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.434096 1195787 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:37.434117 1195787 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1218 00:37:37.434174 1195787 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:37:37.457842 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.477819 1195787 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:37:37.583963 1195787 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:37:37.618382 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:37.637024 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.392142 1195787 node_ready.go:35] waiting up to 6m0s for node "functional-288604" to be "Ready" ...
	I1218 00:37:38.392289 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.392356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.392602 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392638 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392662 1195787 retry.go:31] will retry after 293.380468ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392710 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.392727 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392733 1195787 retry.go:31] will retry after 283.333163ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.392796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:38.676355 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:38.686660 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:38.750557 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753745 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753775 1195787 retry.go:31] will retry after 508.906429ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753840 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:38.753899 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.753916 1195787 retry.go:31] will retry after 283.918132ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:38.893115 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:38.893199 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:38.893535 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.038817 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.092066 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.095485 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.095518 1195787 retry.go:31] will retry after 317.14343ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.262906 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.318327 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.322166 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.322196 1195787 retry.go:31] will retry after 611.398612ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.392378 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.413200 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:39.474250 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.474288 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.474326 1195787 retry.go:31] will retry after 551.991324ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.892368 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:39.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:39.892757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:39.933930 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:39.991113 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:39.991153 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:39.991172 1195787 retry.go:31] will retry after 590.272449ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.027415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:40.085906 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.089482 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.089515 1195787 retry.go:31] will retry after 1.798316027s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.392931 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.393007 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.393310 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:40.393376 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:40.582668 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:40.643859 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:40.643900 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.643941 1195787 retry.go:31] will retry after 1.196819353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:40.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:40.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:40.892768 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.392495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.841577 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:41.888099 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:41.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:41.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:41.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:41.901267 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.901306 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.901323 1195787 retry.go:31] will retry after 1.106575841s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948402 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:41.948447 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:41.948500 1195787 retry.go:31] will retry after 1.314106681s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:42.393054 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.393195 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.393477 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:42.393524 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:42.893249 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:42.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:42.893594 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.008894 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:43.066157 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.066194 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.066212 1195787 retry.go:31] will retry after 2.952953914s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.263490 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:43.325047 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:43.325147 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.325201 1195787 retry.go:31] will retry after 2.165088511s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:43.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:43.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:43.892529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:43.892853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.392698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:44.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:44.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:44.892859 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:44.892927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:45.392615 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.393055 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:45.491313 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:45.548834 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:45.552259 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.552290 1195787 retry.go:31] will retry after 4.009218302s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:45.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:45.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:45.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.020180 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:46.081331 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:46.081373 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.081392 1195787 retry.go:31] will retry after 2.724964309s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:46.392810 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.392886 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.393216 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:46.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:46.893121 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:46.893451 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:46.893527 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:47.392312 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.392379 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.392690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:47.892435 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:47.892508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:47.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:48.806570 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:48.859925 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:48.863450 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.863523 1195787 retry.go:31] will retry after 5.125713123s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:48.892640 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:48.892710 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:48.892972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:49.392419 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.392858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:49.392930 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:49.562244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:49.616549 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:49.619912 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.619976 1195787 retry.go:31] will retry after 7.525324152s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:49.893380 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:49.893476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:49.893792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.392521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:50.892413 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:50.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:50.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.392501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:51.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:51.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:51.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:51.892896 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:52.392394 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:52.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:52.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:52.892886 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:53.892492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:53.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:53.990244 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:37:54.052144 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:54.052189 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.052212 1195787 retry.go:31] will retry after 10.028215297s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:54.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:54.392879 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:54.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:54.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:54.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:55.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:55.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:55.892892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:56.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:56.892535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:56.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:56.892936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:57.146448 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:37:57.223873 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:37:57.223911 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.223929 1195787 retry.go:31] will retry after 7.68443688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:37:57.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.392441 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.392757 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:57.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:57.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:57.892896 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:58.892496 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:58.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:58.892902 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:37:58.892976 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:37:59.392373 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.392729 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:37:59.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:37:59.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:37:59.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.392605 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.392823 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.393374 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:00.893180 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:00.893259 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:00.893590 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:00.893634 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:01.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.392775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:01.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:01.892560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:01.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.392629 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.392702 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.393091 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:02.893045 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:02.893120 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:02.893431 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:03.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.393360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.393682 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:03.393752 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:03.892452 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:03.892588 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:03.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.081415 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:04.149098 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.149154 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.149173 1195787 retry.go:31] will retry after 12.181474759s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.392826 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:04.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:04.892706 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:04.908952 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:04.983582 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:04.983679 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:04.983707 1195787 retry.go:31] will retry after 20.674508131s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:05.393152 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.393222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.393548 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:05.892344 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:05.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:05.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:05.892840 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:06.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:06.892466 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:06.892581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:06.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.392796 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.392870 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.393185 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:07.893008 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:07.893099 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:07.893411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:07.893460 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:08.393200 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.393269 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.393580 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:08.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:08.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:08.892790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.392350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.392434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:09.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:09.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:09.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:10.392470 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.392542 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:10.392885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:10.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:10.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:10.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.392567 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.392748 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:11.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:11.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:11.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.392724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:12.892526 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:12.892600 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:12.892927 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:12.892994 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:13.392672 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.393083 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:13.892350 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:13.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:13.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:14.892423 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:14.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:14.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:15.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.392797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:15.392868 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:15.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:15.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:15.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.331590 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:16.385966 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:16.389831 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.389871 1195787 retry.go:31] will retry after 10.81475415s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:16.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.393176 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.393493 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:16.893314 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:16.893409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:16.893794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:17.392528 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.392670 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:17.393070 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:17.892892 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:17.892977 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:17.893319 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.393167 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.393296 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:18.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:18.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:18.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:19.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.392531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.393011 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:19.393093 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:19.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:19.892493 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:19.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.392457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.392540 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:20.892404 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:20.892505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:20.892840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.392336 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.392752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:21.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:21.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:21.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:21.892899 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:22.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.392824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:22.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:22.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:22.892650 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:23.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:23.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:23.892812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:24.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.392459 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.392739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:24.392822 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:24.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:24.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.392505 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.392578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.392919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:25.658449 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:25.718689 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:25.718787 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.718811 1195787 retry.go:31] will retry after 20.411460434s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:25.893032 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:25.893152 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:25.893496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:26.393268 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.393345 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.393658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:26.393735 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:26.892424 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:26.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:26.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.205308 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:27.264744 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:27.264795 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.264821 1195787 retry.go:31] will retry after 26.872581906s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:27.393247 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.393343 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.393691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:27.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:27.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:27.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.392400 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.392499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:28.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:28.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:28.892880 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:28.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:29.392435 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.392530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:29.892518 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:29.892615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:29.892959 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.392483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:30.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:30.892684 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:30.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:30.893088 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:31.392443 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.392538 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:31.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:31.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:31.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.392398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:32.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:32.892495 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:32.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:33.392363 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.392786 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:33.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:33.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:33.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:33.892942 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.392843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:34.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:34.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:34.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:35.392430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:35.392919 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:35.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:35.892552 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:35.892909 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:36.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:36.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:36.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:37.392676 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.392747 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.393087 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:37.393163 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:37.892815 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:37.892918 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:37.893191 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.392972 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.393069 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.393395 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:38.893206 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:38.893283 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:38.893614 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.392750 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:39.892457 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:39.892547 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:39.892914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:39.892965 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:40.392478 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:40.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:40.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:40.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.392429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:41.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:41.892557 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:41.892907 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:42.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.392468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.392822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:42.392895 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:42.892798 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:42.893169 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.392965 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.393041 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.393425 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:43.893065 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:43.893192 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:43.893542 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:44.393319 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.393457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.393797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:44.393864 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:44.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:44.892480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:44.892887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.392407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.392691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:45.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:45.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:45.892831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.131350 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:38:46.207148 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:46.207192 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.207211 1195787 retry.go:31] will retry after 46.493082425s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:46.392632 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.393042 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:46.892356 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:46.892439 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:46.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:46.892784 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:47.392561 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.392655 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.393022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:47.892957 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:47.893052 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:47.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.393028 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.393151 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.393502 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:48.893231 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:48.893339 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:48.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:48.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:49.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:49.892410 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:49.892719 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.392567 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.392922 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:50.892639 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:50.892718 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:50.893017 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:51.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:51.392927 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:51.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:51.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:51.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.392587 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.392689 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.393078 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:52.892911 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:52.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:52.893278 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:53.393104 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.393175 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.393506 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:53.393578 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:53.893174 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:53.893253 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:53.893635 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.138097 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:38:54.199604 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:38:54.199639 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.199657 1195787 retry.go:31] will retry after 32.999586692s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1218 00:38:54.392915 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.392997 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.393320 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:54.893151 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:54.893222 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:54.893558 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:55.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.393372 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.393696 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:55.393771 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:55.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:55.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:55.892791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.392482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:56.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:56.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:56.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.392536 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:57.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:57.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:57.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:38:57.892898 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:38:58.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.392510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.392842 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:58.892427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:58.892690 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.392771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:38:59.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:38:59.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:38:59.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:00.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.392852 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:00.392906 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:00.892446 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:00.892521 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:00.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.392554 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.392642 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.392932 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:01.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:01.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:01.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:02.392460 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:02.392937 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:02.893082 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:02.893161 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:02.893517 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.393213 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.393335 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.393710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:03.893298 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:03.893393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:03.893695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.392345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.392774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:04.892322 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:04.892394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:04.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:04.892799 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:05.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:05.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:05.892687 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:05.893007 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.392320 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.392389 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.392734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:06.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:06.892488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:06.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:06.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:07.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.392661 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.393015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:07.892841 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:07.892966 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:07.893308 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.393076 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.393143 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.393465 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:08.893245 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:08.893318 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:08.893642 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:08.893706 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:09.392340 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:09.892426 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:09.892509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:09.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.392603 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.393041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:10.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:10.892396 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:10.892674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:11.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.392452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.392788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:11.392854 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:11.892429 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:11.892513 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:11.892864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.392694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:12.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:12.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:12.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:13.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.392558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:13.392904 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:13.892358 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:13.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:13.892794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.392462 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.392887 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:14.892493 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:14.892568 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:14.892916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.392331 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.392728 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:15.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:15.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:15.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:15.892902 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:16.392592 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.392678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.393051 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:16.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:16.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:16.892766 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.392504 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.392612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.392914 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:17.892926 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:17.893001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:17.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:17.893380 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:18.393186 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.393287 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.393589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:18.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:18.892436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:18.892724 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.392763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:19.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:19.892501 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:19.892879 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:20.392577 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.392677 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:20.393034 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:20.892700 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:20.892780 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:20.893096 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.392884 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.392972 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.393246 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:21.893036 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:21.893115 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:21.893439 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:22.393105 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.393180 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.393531 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:22.393602 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:22.893283 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:22.893357 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:22.893654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.392360 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.392759 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:23.892410 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:23.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:23.892763 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.392324 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.392671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:24.892367 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:24.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:24.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:24.892851 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:25.392405 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.392832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:25.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:25.892428 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:25.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:26.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:26.892598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:26.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:26.893059 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:27.199423 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1218 00:39:27.258871 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262674 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:27.262814 1195787 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:27.393144 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.393338 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.393739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:27.892445 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:27.892515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:27.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.392589 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.392686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.393001 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:28.892388 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:28.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:28.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:29.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:29.392759 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:29.892449 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:29.892575 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:29.892898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.392323 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:30.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:30.892522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:30.892860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:31.392593 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.392698 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.393057 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:31.393175 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:31.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:31.892746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.393076 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:32.700811 1195787 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1218 00:39:32.759422 1195787 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759519 1195787 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1218 00:39:32.759610 1195787 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1218 00:39:32.762569 1195787 out.go:179] * Enabled addons: 
	I1218 00:39:32.766452 1195787 addons.go:530] duration metric: took 1m55.398865574s for enable addons: enabled=[]
	I1218 00:39:32.892720 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:32.892834 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:32.893134 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:33.392885 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.392951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.393266 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:33.393360 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:33.893120 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:33.893193 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:33.893559 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.393379 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.393480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.393839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:34.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:34.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:34.892868 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.392614 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.392707 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.393073 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:35.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:35.892537 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:35.892861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:35.892942 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:36.392383 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.392802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:36.892719 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:36.892802 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:36.893187 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.393130 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.393210 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.393536 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:37.892374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:37.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:37.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:38.392658 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.392729 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:38.393158 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:38.892929 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:38.893002 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:38.893353 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.393129 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.393197 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.393452 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:39.893242 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:39.893317 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:39.893673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.392421 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.392517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:40.892369 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:40.892525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:40.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:40.892938 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:41.392408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.392484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.392798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:41.892430 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:41.892517 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:41.892875 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.392374 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.392446 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.392711 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:42.892629 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:42.892701 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:42.893027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:42.893091 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:43.392773 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.392853 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.393188 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:43.892986 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:43.893071 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:43.893334 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.393112 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.393187 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.393481 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:44.893282 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:44.893350 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:44.893683 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:44.893740 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:45.392327 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.392408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.392700 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:45.892412 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:45.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:45.892795 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.392830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:46.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:46.892449 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:46.892694 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:47.392541 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.392610 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:47.392943 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:47.892913 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:47.892990 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:47.893323 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.392935 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.393001 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.393289 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:48.893086 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:48.893157 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:48.893470 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:49.393330 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.393420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.393740 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:49.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:49.892325 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:49.892391 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:49.892665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.392473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.392861 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:50.892447 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:50.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:50.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.392762 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:51.892406 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:51.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:51.892774 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:51.892823 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:52.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.392553 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:52.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:52.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:52.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.392422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.392736 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:53.892489 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:53.892612 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:53.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:53.893004 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:54.392376 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.392443 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:54.892366 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:54.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:54.892778 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.392479 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.392867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:55.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:55.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:56.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.392808 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:56.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:56.892408 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:56.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:56.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.392473 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.392860 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:57.892694 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:57.892772 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:57.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.392496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:58.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:58.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:58.892739 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:39:58.892789 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:39:59.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.392532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:39:59.892560 1195787 type.go:168] "Request Body" body=""
	I1218 00:39:59.892633 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:39:59.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.392899 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:00.892419 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:00.892832 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:00.892885 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:01.392424 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:01.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:01.892545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:01.892996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.392436 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.392515 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.392804 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:02.892688 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:02.892766 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:02.893081 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:02.893136 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:03.392384 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:03.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:03.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:03.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.392791 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:04.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:04.892671 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:05.392381 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.392455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.392803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:05.392855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:05.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:05.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:05.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.392379 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:06.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:06.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:07.392557 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.392630 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.392948 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:07.393044 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:07.892947 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:07.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:07.893292 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.393116 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.393189 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.393503 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:08.893257 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:08.893330 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:08.893657 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:09.393298 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.393365 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.393623 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:09.393667 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:09.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:09.892429 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:09.892779 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.392530 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.392997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:10.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:10.892422 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:10.892749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:11.892544 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:11.892620 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:11.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:11.893010 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:12.392339 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.392416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.392715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:12.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:12.892483 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:12.892814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:13.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:13.892445 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:13.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:14.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.392813 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:14.392867 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:14.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:14.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:14.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.392448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:15.892562 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:15.892645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:15.892993 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:16.392714 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.392789 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.393108 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:16.393181 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:16.892981 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:16.893057 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:16.893318 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.393280 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.393362 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.393715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:17.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:17.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:17.892960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.392754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:18.892443 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:18.892519 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:18.892891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:18.892951 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:19.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.392715 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.393048 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:19.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:19.892659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.392464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.392845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:20.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:20.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:20.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:21.392346 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:21.392721 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:21.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:21.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:21.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.392452 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.392527 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:22.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:22.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:22.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:23.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:23.392882 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:23.892444 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:23.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:23.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.392684 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:24.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:24.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:24.892811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:25.392518 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.392622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.393004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:25.393058 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:25.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:25.892399 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:25.892660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.392355 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:26.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:26.892589 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:26.892889 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.392982 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:27.892845 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:27.892928 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:27.893247 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:27.893296 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:28.392780 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.392854 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.393198 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:28.892980 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:28.893051 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:28.893305 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.393025 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.393097 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.393406 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:29.893167 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:29.893246 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:29.893569 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:29.893625 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:30.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.392419 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.392749 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:30.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:30.892528 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:30.892844 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.392465 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.392817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:31.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:31.892407 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:31.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:32.392439 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.392882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:32.392936 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:32.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:32.892506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:32.892836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:33.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:33.892484 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:33.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.392470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:34.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:34.892473 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:34.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:34.892850 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:35.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.392811 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:35.892523 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:35.892622 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:35.892978 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.392672 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:36.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:36.892401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:36.892754 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.392604 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:37.393019 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:37.892964 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:37.893061 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:37.893356 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.393199 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.393280 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.393633 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:38.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:38.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:38.892784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.392710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:39.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:39.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:39.892802 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:39.892855 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:40.392417 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.392828 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:40.892328 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:40.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:40.892649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:41.892462 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:41.892534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:41.892867 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:42.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.392664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:42.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:42.892597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:42.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.392651 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.392720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.393047 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:43.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:43.892468 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:43.892851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:44.392388 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.392461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:44.392853 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:44.892484 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:44.892559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:44.892919 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.392436 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.392680 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:45.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:45.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:45.892767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.392422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:46.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:46.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:46.892679 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:46.892729 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:47.392534 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.392616 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.392984 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:47.893009 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:47.893086 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:47.893414 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.393173 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.393243 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.393496 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:48.893310 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:48.893387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:48.893701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:48.893755 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:49.392365 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.392442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.392767 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:49.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:49.892412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:49.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.392433 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.392506 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:50.892416 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:50.892503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:50.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:51.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.392457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.392704 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:51.392745 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:51.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:51.892482 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:51.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.392597 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.392916 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:52.892836 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:52.892908 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:52.893174 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:53.393008 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.393079 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.393392 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:53.393445 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:53.893170 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:53.893250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:53.893577 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.393256 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.393323 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.393624 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:54.892833 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:54.892909 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:54.893230 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.392807 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.392879 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.393195 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:55.892903 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:55.892983 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:55.893248 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:55.893295 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:56.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.393164 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.393494 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:56.893233 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:56.893303 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:56.893625 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.392308 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.392670 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:57.892561 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:57.892640 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:57.893004 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:58.392369 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.392770 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:40:58.392830 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:40:58.892348 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:58.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:58.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.392417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.392726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:40:59.892422 1195787 type.go:168] "Request Body" body=""
	I1218 00:40:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:40:59.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.392375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.392456 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.392732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:00.892425 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:00.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:00.892827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:00.892926 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:01.392600 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.392680 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.393027 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:01.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:01.892461 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:01.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.392459 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.392534 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.392891 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:02.892379 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:02.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:02.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:03.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.392673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:03.392717 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:03.892448 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:03.892520 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:03.892803 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.392800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:04.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:04.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:05.392386 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:05.392862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:05.892501 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:05.892578 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:05.892871 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.392458 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.392846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:06.892564 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:06.892651 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:06.892947 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:07.392849 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.392927 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.393249 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:07.393300 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:07.893061 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:07.893141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:07.893407 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.393571 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:08.893213 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:08.893285 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:08.893586 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:09.393338 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.393409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.393737 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:09.393794 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:09.892310 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:09.892384 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:09.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:10.892460 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:10.892531 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:10.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.392490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.392821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:11.892377 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:11.892452 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:11.892789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:11.892845 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:12.392361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.392447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.392801 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:12.892659 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:12.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:12.893041 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.392441 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.392514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.392855 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:13.892343 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:13.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:13.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:14.392354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.392431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:14.392815 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:14.892477 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:14.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:14.892893 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:15.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:15.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:15.892845 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:16.392432 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.392508 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:16.392886 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:16.892403 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:16.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:16.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.392663 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.392754 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.393122 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:17.893034 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:17.893112 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:17.893434 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:18.393225 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.393298 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.393566 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:18.393619 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:18.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:18.892448 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:18.892776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.392475 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.392554 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.392892 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:19.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:19.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.392463 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.392545 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:20.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:20.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:20.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:20.892875 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:21.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.392460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.392697 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:21.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:21.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:21.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.392426 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:22.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:22.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:22.892691 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:23.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.392835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:23.392897 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:23.892549 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:23.892621 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:23.892930 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.392371 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.392453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:24.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:24.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:24.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:25.392487 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.392571 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.392908 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:25.392969 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:25.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:25.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:25.892701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.392418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.392498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.392829 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:26.892418 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:26.892500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:26.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.392465 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.392539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.392864 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:27.892616 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:27.892694 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:27.892997 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:27.893042 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:28.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.392799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:28.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:28.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:28.892807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.392368 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.392438 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:29.892380 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:29.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:29.892777 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:30.392322 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.392393 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:30.392775 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:30.892378 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:30.892450 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:30.892781 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.392484 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.392563 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.392894 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:31.892335 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:31.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:31.892712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:32.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.392509 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:32.392859 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:32.892645 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:32.892720 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:32.893273 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.393053 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.393128 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.393383 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:33.893197 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:33.893271 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:33.893602 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.392329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.392725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:34.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:34.892409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:34.892714 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:34.892764 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:35.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:35.892454 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:35.892530 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:35.892857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.392364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.392716 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:36.892405 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:36.892486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:36.892818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:36.892870 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:37.392455 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.392562 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.392936 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:37.893011 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:37.893129 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:37.893427 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.393291 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.393628 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:38.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:38.893349 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:38.893718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:38.893795 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:39.393265 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.393337 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.393588 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:39.893331 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:39.893408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:39.893734 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.392756 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:40.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:40.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:40.892710 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:41.392404 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.392475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:41.392842 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:41.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:41.892453 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:41.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.392491 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:42.892665 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:42.892740 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:42.893071 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:43.392639 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.392714 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.393030 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:43.393087 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:43.892352 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:43.892423 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:43.892741 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.392407 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.392479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.392789 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:44.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:44.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:44.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:45.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:45.892576 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:45.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:45.892958 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:46.392429 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.392836 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:46.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:46.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:46.892695 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.392510 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.392586 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.392951 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:47.892396 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:47.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:47.892835 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:48.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.392712 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:48.392768 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:48.892393 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:48.892464 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:48.892771 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.392463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.392787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:49.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:49.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:49.892646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:50.392391 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.392466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.392853 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:50.392909 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:50.892579 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:50.892652 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:50.892985 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.392709 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:51.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:51.892455 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:51.892797 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.392489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.392792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:52.892354 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:52.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:52.892678 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:52.892725 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:53.392414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.392503 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.392831 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:53.892409 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:53.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:53.892806 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:54.892420 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:54.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:54.892854 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:54.892908 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:55.392584 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.392665 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.392996 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:55.892326 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:55.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:55.892800 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.392401 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.392474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:56.892515 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:56.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:56.892944 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:56.892999 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:57.392855 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.392925 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:57.893155 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:57.893229 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:57.893537 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.393357 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.393431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.393713 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:58.892337 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:58.892424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:58.892702 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:41:59.392495 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.392587 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.392917 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:41:59.392973 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:41:59.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:41:59.892497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:41:59.892830 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.392761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:00.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:00.892496 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:00.893052 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:01.392764 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.392860 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.393175 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:01.393227 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:01.892948 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:01.893015 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:01.893270 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.393089 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.393163 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.393444 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:02.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:02.892430 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:02.892742 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.392731 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:03.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:03.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:03.892755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:03.892801 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:04.392448 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.392523 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.392850 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:04.892336 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:04.892418 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:04.892677 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.392387 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:05.892397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:05.892472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:05.892798 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:05.892848 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:06.392347 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.392427 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.392708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:06.892407 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:06.892539 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:06.892929 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.392690 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.392765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.393085 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:07.893075 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:07.893147 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:07.893398 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:07.893442 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:08.393253 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.393325 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.393644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:08.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:08.892405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:08.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.392738 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:09.892421 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:09.892502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:09.892843 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:10.392423 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.392502 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.392838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:10.392894 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:10.892381 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:10.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:10.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.392445 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.392516 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.392851 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:11.892553 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:11.892632 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:11.892973 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.392415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.392666 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:12.892399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:12.892477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:12.892819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:12.892876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:13.392412 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.392492 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.392815 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:13.892351 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:13.892421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:13.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.392377 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.392458 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:14.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:14.892609 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:14.892903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:14.892953 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:15.392344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.392701 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:15.892392 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:15.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:15.892792 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.392442 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.392525 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.392833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:16.892327 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:16.892416 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:16.892667 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:17.392522 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.392590 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:17.392921 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:17.892652 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:17.892728 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:17.893037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.392334 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.392723 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:18.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:18.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:18.892817 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.392399 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.392472 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.392812 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:19.892338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:19.892411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:19.892730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:19.892782 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:20.392466 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.392863 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:20.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:20.892489 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:20.892821 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.392341 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.392413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:21.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:21.892469 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:21.892793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:21.892849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:22.392532 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.392615 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.392957 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:22.892669 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:22.892739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:22.892988 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.392397 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.392497 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:23.892504 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:23.892580 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:23.892912 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:23.892972 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:24.392362 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.392433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.392780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:24.892398 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:24.892471 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:24.892785 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.392469 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.392543 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.392872 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:25.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:25.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:25.892715 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:26.392450 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.392522 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.392876 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:26.392935 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:26.892614 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:26.892686 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:26.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.392919 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.392992 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.393253 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:27.893175 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:27.893249 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:27.893584 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.392338 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.392420 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:28.892361 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:28.892451 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:28.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:28.892862 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:29.392512 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.392870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:29.892578 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:29.892656 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:29.893010 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.392314 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.392392 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.392665 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:30.893263 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:30.893356 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:30.893732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:30.893797 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:31.392476 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.392560 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.392866 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:31.892342 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:31.892431 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:31.892698 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.392396 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.392782 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:32.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:32.892462 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:32.892824 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:33.392366 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.392478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.392818 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:33.392876 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:33.892373 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:33.892447 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:33.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.392507 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.392581 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.392934 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:34.892345 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:34.892413 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:34.892752 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:35.392488 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.392559 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.392873 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:35.392932 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:35.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:35.892504 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:35.892839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.392359 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.392425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.392660 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:36.892395 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:36.892478 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:36.892816 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:37.392521 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.393104 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:37.393155 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:37.892886 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:37.892951 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:37.893194 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.392988 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.393067 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.393390 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:38.893201 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:38.893273 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:38.893589 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:39.393344 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.393415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.393662 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:39.393702 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:39.892383 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:39.892466 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:39.892825 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.392416 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.392807 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:40.892331 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:40.892408 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:40.892764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.392451 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.392529 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.392862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:41.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:41.892494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:41.892870 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:41.892924 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:42.392330 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.392411 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:42.892516 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:42.892611 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:42.892938 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.392638 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.392711 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.393037 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:43.892699 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:43.892765 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:43.893015 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:43.893055 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:44.392565 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.392645 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.392956 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:44.892656 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:44.892731 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:44.893058 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.392581 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.392846 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.393290 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:45.893049 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:45.893123 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:45.893447 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:45.893502 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:46.393297 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.393390 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.393780 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:46.892349 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:46.892433 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:46.892799 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.392560 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.392631 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.392960 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:47.892414 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:47.892485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:47.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:48.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.392401 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.392663 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:48.392703 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:48.892391 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:48.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:48.892846 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.392544 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.392625 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.392943 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:49.892332 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:49.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:49.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:50.392389 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.392794 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:50.392849 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:50.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:50.892479 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:50.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.392402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.392654 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:51.892315 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:51.892388 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:51.892718 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:52.392427 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.392827 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:52.392889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:52.892324 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:52.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:52.892673 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.392487 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:53.892461 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:53.892546 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:53.892874 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.392387 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.392784 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:54.892485 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:54.892558 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:54.892882 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:54.892940 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:55.392636 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.392727 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.393066 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:55.892323 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:55.892398 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:55.892676 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.392410 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.392507 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.392814 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:56.892390 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:56.892499 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:56.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:57.392739 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.392818 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.393074 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:57.393125 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:42:57.892970 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:57.893044 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:57.893385 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.393177 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.393250 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.393560 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:58.893294 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:58.893360 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:58.893621 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.392333 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.392755 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:42:59.892386 1195787 type.go:168] "Request Body" body=""
	I1218 00:42:59.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:42:59.892745 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:42:59.892790 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:00.392349 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.392421 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.392783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:00.892387 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:00.892467 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:00.892805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.392514 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.392584 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.392898 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:01.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:01.892415 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:01.892664 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:02.392403 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.392819 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:02.392871 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:02.892382 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:02.892463 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:02.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.392319 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.392395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.392644 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:03.892329 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:03.892400 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:03.892717 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.392425 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.392500 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.392805 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:04.892330 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:04.892395 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:04.892634 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:04.892673 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:05.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.392477 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.392793 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:05.892394 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:05.892475 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:05.892809 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.392337 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.392406 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.392658 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:06.892401 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:06.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:06.892787 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:06.892843 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:07.392572 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.392646 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.392952 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:07.892874 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:07.892944 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:07.893199 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.393002 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.393080 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.393411 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:08.893209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:08.893286 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:08.893674 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:08.893724 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:09.392325 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.392655 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:09.892370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:09.892460 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:09.892783 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.392431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.392505 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:10.892333 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:10.892402 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:10.892726 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:11.392402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.392480 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.392823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:11.392883 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:11.892528 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:11.892605 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:11.892939 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.392370 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.392444 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.392689 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:12.892525 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:12.892624 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:12.892950 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:13.392523 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.392598 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.392903 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:13.392963 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:13.892519 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:13.892591 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:13.892833 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.392409 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.392849 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:14.892431 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:14.892510 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:14.892823 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.392321 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.392397 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.392685 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:15.892355 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:15.892434 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:15.892732 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:15.892778 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:16.392342 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.392424 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.392746 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:16.893271 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:16.893341 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:16.893646 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.392499 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.392579 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.392911 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:17.892417 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:17.892490 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:17.892760 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:17.892807 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:18.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.392409 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.392730 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:18.892402 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:18.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:18.892788 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.393074 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.393141 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.393442 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:19.893090 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:19.893170 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:19.893422 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:19.893473 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:20.393209 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.393284 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.393611 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:20.893284 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:20.893361 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:20.893687 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.392335 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.392405 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:21.892400 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:21.892476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:21.892822 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:22.392393 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.392476 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.392790 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:22.392844 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:22.892346 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:22.892425 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:22.892708 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.392485 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.392810 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:23.892364 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:23.892440 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:23.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.392326 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.392394 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.392659 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:24.892365 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:24.892498 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:24.892838 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:24.892889 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:25.392413 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.392488 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.392840 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:25.892357 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:25.892457 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:25.892775 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.392472 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.392549 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.392878 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:26.892582 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:26.892678 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:26.893022 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:26.893073 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:27.392724 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.392791 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.393032 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:27.893016 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:27.893101 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:27.893433 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.393249 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.393326 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.393649 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:28.892385 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:28.892474 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:28.892772 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:29.392456 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.392535 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.392869 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:29.392915 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:29.892602 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:29.892673 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:29.893000 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.392343 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.392412 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.392707 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:30.892440 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:30.892514 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:30.892796 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.392486 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.392566 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.392877 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:31.892341 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:31.892417 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:31.892669 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:31.892716 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:32.392390 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.392494 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.392776 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:32.892451 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:32.892524 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:32.892858 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:33.392332 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.392404 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.397972 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1218 00:43:33.892458 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:33.892532 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:33.892862 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:33.892918 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:34.392411 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.392486 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.392857 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:34.892375 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:34.892442 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:34.892725 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.392438 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.392511 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.392839 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:35.892534 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:35.892662 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:35.892986 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:35.893039 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:36.392328 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.392403 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.392764 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:36.892376 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:36.892470 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:36.892761 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.392655 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.392739 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.393068 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1218 00:43:37.892912 1195787 type.go:168] "Request Body" body=""
	I1218 00:43:37.892985 1195787 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-288604" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1218 00:43:37.893251 1195787 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1218 00:43:37.893290 1195787 node_ready.go:55] error getting node "functional-288604" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-288604": dial tcp 192.168.49.2:8441: connect: connection refused
	I1218 00:43:38.393068 1195787 node_ready.go:38] duration metric: took 6m0.000870722s for node "functional-288604" to be "Ready" ...
	I1218 00:43:38.396243 1195787 out.go:203] 
	W1218 00:43:38.399208 1195787 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1218 00:43:38.399223 1195787 out.go:285] * 
	W1218 00:43:38.401353 1195787 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:43:38.404386 1195787 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:43:47 functional-288604 crio[5385]: time="2025-12-18T00:43:47.483682926Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=459e7e34-46b9-4af3-aeb2-3828d78787c0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.544721096Z" level=info msg="Checking image status: minikube-local-cache-test:functional-288604" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.544915906Z" level=info msg="Resolving \"minikube-local-cache-test\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.54496525Z" level=info msg="Image minikube-local-cache-test:functional-288604 not found" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.545066023Z" level=info msg="Neither image nor artfiact minikube-local-cache-test:functional-288604 found" id=79ae8c57-cb86-4a89-9d9e-488f57f10ba4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570709928Z" level=info msg="Checking image status: docker.io/library/minikube-local-cache-test:functional-288604" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570874978Z" level=info msg="Image docker.io/library/minikube-local-cache-test:functional-288604 not found" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.570929262Z" level=info msg="Neither image nor artfiact docker.io/library/minikube-local-cache-test:functional-288604 found" id=447cd5bb-fff0-481e-bb0d-889c82fd5d99 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598192827Z" level=info msg="Checking image status: localhost/library/minikube-local-cache-test:functional-288604" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598368732Z" level=info msg="Image localhost/library/minikube-local-cache-test:functional-288604 not found" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:48 functional-288604 crio[5385]: time="2025-12-18T00:43:48.598409798Z" level=info msg="Neither image nor artfiact localhost/library/minikube-local-cache-test:functional-288604 found" id=7a1a91e3-6aa3-4392-890d-10a07bdb8a7c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.536834458Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=e9495e62-56fb-476b-9a5c-2d2a6e884dca name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852180343Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852349881Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:49 functional-288604 crio[5385]: time="2025-12-18T00:43:49.852397609Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=4ec1c8ce-835c-4c94-b05b-0569237f7941 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417235866Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417374333Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.417420551Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=8af7c980-0705-4445-a0e7-28aef17f7d2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.44140991Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.441570857Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.441617437Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=0ddd8dbb-f8d5-4759-8554-1d72ca750067 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.464727663Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.464890079Z" level=info msg="Image registry.k8s.io/pause:latest not found" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:50 functional-288604 crio[5385]: time="2025-12-18T00:43:50.46494191Z" level=info msg="Neither image nor artfiact registry.k8s.io/pause:latest found" id=87e28ff9-65d3-4f82-882a-d2009be58be1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:43:51 functional-288604 crio[5385]: time="2025-12-18T00:43:51.017019016Z" level=info msg="Checking image status: registry.k8s.io/pause:latest" id=e08e7f89-a2cd-41e3-9d16-4f3e39c6f546 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:43:54.913394    9548 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:54.914048    9548 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:54.915667    9548 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:54.916245    9548 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:43:54.917832    9548 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:43:54 up  7:26,  0 user,  load average: 0.41, 0.25, 0.59
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1157.
	Dec 18 00:43:52 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:52 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:52 functional-288604 kubelet[9424]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:52 functional-288604 kubelet[9424]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:52 functional-288604 kubelet[9424]: E1218 00:43:52.963612    9424 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:52 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:53 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1158.
	Dec 18 00:43:53 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:53 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:53 functional-288604 kubelet[9444]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:53 functional-288604 kubelet[9444]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:53 functional-288604 kubelet[9444]: E1218 00:43:53.692665    9444 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:53 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:53 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:43:54 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1159.
	Dec 18 00:43:54 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:54 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:43:54 functional-288604 kubelet[9464]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:54 functional-288604 kubelet[9464]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:43:54 functional-288604 kubelet[9464]: E1218 00:43:54.453051    9464 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:43:54 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:43:54 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (363.786808ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.42s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (735.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1218 00:44:32.021061 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:48:19.396415 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:49:32.021023 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:49:42.461966 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:52:35.100368 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:53:19.396829 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:54:32.021532 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m12.958458544s)

-- stdout --
	* [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision
	
	

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000113635s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m12.9597507s for "functional-288604" cluster.
I1218 00:56:08.893803 1159552 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (314.427277ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                      ARGS                                                                       │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-240845 ssh pgrep buildkitd                                                                                                           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image   │ functional-240845 image ls --format yaml --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                          │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format table --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format short --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete  │ -p functional-240845                                                                                                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start   │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start   │ -p functional-288604 --alsologtostderr -v=8                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.1                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.3                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:latest                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add minikube-local-cache-test:functional-288604                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache delete minikube-local-cache-test:functional-288604                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ list                                                                                                                                            │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl images                                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ cache   │ functional-288604 cache reload                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ kubectl │ functional-288604 kubectl -- --context functional-288604 get pods                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ start   │ -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:43:55
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:43:55.978742 1201669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:43:55.978849 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.978853 1201669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:43:55.978857 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.979124 1201669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:43:55.979466 1201669 out.go:368] Setting JSON to false
	I1218 00:43:55.980315 1201669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26784,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:43:55.980372 1201669 start.go:143] virtualization:  
	I1218 00:43:55.983789 1201669 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:43:55.987542 1201669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:43:55.987604 1201669 notify.go:221] Checking for updates...
	I1218 00:43:55.993164 1201669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:43:55.995954 1201669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:43:55.999614 1201669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:43:56.002831 1201669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:43:56.005802 1201669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:43:56.009212 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:56.009315 1201669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:43:56.041210 1201669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:43:56.041338 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.105588 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.095254501 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.105683 1201669 docker.go:319] overlay module found
	I1218 00:43:56.108792 1201669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:43:56.111628 1201669 start.go:309] selected driver: docker
	I1218 00:43:56.111638 1201669 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APISer
verName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fa
lse CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.111765 1201669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:43:56.111873 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.170180 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.160520969 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.170597 1201669 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:43:56.170621 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:56.170672 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:56.170715 1201669 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Con
tainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.173990 1201669 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:43:56.177055 1201669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:43:56.179992 1201669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:43:56.182847 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:56.182889 1201669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:43:56.182897 1201669 cache.go:65] Caching tarball of preloaded images
	I1218 00:43:56.182969 1201669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:43:56.182979 1201669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:43:56.182988 1201669 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:43:56.183103 1201669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:43:56.202673 1201669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:43:56.202684 1201669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:43:56.202702 1201669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:43:56.202743 1201669 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:43:56.202797 1201669 start.go:364] duration metric: took 37.488µs to acquireMachinesLock for "functional-288604"
	I1218 00:43:56.202818 1201669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:43:56.202823 1201669 fix.go:54] fixHost starting: 
	I1218 00:43:56.203129 1201669 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:43:56.220546 1201669 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:43:56.220565 1201669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:43:56.223742 1201669 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:43:56.223770 1201669 machine.go:94] provisionDockerMachine start ...
	I1218 00:43:56.223861 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.243517 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.243858 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.243865 1201669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:43:56.399607 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.399622 1201669 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:43:56.399683 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.417287 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.417598 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.417605 1201669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:43:56.583098 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.583184 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.603369 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.603669 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.603683 1201669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:43:56.772929 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:43:56.772944 1201669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:43:56.772975 1201669 ubuntu.go:190] setting up certificates
	I1218 00:43:56.772989 1201669 provision.go:84] configureAuth start
	I1218 00:43:56.773070 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:56.789980 1201669 provision.go:143] copyHostCerts
	I1218 00:43:56.790044 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:43:56.790056 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:43:56.790131 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:43:56.790231 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:43:56.790235 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:43:56.790260 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:43:56.790310 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:43:56.790313 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:43:56.790335 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:43:56.790376 1201669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:43:56.986120 1201669 provision.go:177] copyRemoteCerts
	I1218 00:43:56.986182 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:43:56.986224 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.010906 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.115839 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:43:57.132835 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:43:57.150663 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:43:57.167535 1201669 provision.go:87] duration metric: took 394.523589ms to configureAuth
	I1218 00:43:57.167552 1201669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:43:57.167745 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:57.167846 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.184649 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:57.184955 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:57.184966 1201669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:43:57.547661 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:43:57.547677 1201669 machine.go:97] duration metric: took 1.323900056s to provisionDockerMachine
	I1218 00:43:57.547689 1201669 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:43:57.547701 1201669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:43:57.547767 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:43:57.547816 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.568532 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.675839 1201669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:43:57.679095 1201669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:43:57.679112 1201669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:43:57.679121 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:43:57.679176 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:43:57.679251 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:43:57.679324 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:43:57.679367 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:43:57.686719 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:57.703522 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:43:57.720871 1201669 start.go:296] duration metric: took 173.166293ms for postStartSetup
	I1218 00:43:57.720943 1201669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:43:57.720983 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.737854 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.841489 1201669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:43:57.846519 1201669 fix.go:56] duration metric: took 1.643688341s for fixHost
	I1218 00:43:57.846534 1201669 start.go:83] releasing machines lock for "functional-288604", held for 1.6437309s
	I1218 00:43:57.846614 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:57.862813 1201669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:43:57.862836 1201669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:43:57.862859 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.862906 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.880942 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.881296 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.984097 1201669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:43:58.077458 1201669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:43:58.117786 1201669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:43:58.128203 1201669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:43:58.128283 1201669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:43:58.137853 1201669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:43:58.137867 1201669 start.go:496] detecting cgroup driver to use...
	I1218 00:43:58.137898 1201669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:43:58.137955 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:43:58.154333 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:43:58.171243 1201669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:43:58.171317 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:43:58.187629 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:43:58.200443 1201669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:43:58.332309 1201669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:43:58.456320 1201669 docker.go:234] disabling docker service ...
	I1218 00:43:58.456386 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:43:58.471261 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:43:58.484090 1201669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:43:58.600872 1201669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:43:58.712059 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:43:58.725312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:43:58.738398 1201669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:43:58.738467 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.746850 1201669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:43:58.746917 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.755273 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.763400 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.771727 1201669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:43:58.779324 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.788210 1201669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.796348 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.804389 1201669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:43:58.811403 1201669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:43:58.818408 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:58.951912 1201669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:43:59.118783 1201669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:43:59.118849 1201669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:43:59.122545 1201669 start.go:564] Will wait 60s for crictl version
	I1218 00:43:59.122604 1201669 ssh_runner.go:195] Run: which crictl
	I1218 00:43:59.126019 1201669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:43:59.148982 1201669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:43:59.149067 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.175940 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.206912 1201669 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:43:59.209698 1201669 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:43:59.225649 1201669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:43:59.232549 1201669 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1218 00:43:59.235431 1201669 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeC
A APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: Disabl
eOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:43:59.235543 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:59.235614 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.273407 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.273418 1201669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:43:59.273471 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.299275 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.299287 1201669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:43:59.299293 1201669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:43:59.299404 1201669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:43:59.299490 1201669 ssh_runner.go:195] Run: crio config
	I1218 00:43:59.362084 1201669 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1218 00:43:59.362106 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:59.362113 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:59.362126 1201669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:43:59.362149 1201669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:43:59.362277 1201669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:43:59.362352 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:43:59.369805 1201669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:43:59.369864 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:43:59.376968 1201669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:43:59.388765 1201669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:43:59.400454 1201669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2069 bytes)
	I1218 00:43:59.412514 1201669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:43:59.416040 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:59.531606 1201669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:43:59.640794 1201669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:43:59.640805 1201669 certs.go:195] generating shared ca certs ...
	I1218 00:43:59.640830 1201669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:43:59.640959 1201669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:43:59.641001 1201669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:43:59.641007 1201669 certs.go:257] generating profile certs ...
	I1218 00:43:59.641121 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:43:59.641164 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:43:59.641201 1201669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:43:59.641309 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:43:59.641337 1201669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:43:59.641343 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:43:59.641373 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:43:59.641395 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:43:59.641423 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:43:59.641463 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:59.642073 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:43:59.660992 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:43:59.679818 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:43:59.699150 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:43:59.718895 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:43:59.738413 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:43:59.756315 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:43:59.773826 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:43:59.791059 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:43:59.807447 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:43:59.824212 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:43:59.841186 1201669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:43:59.853492 1201669 ssh_runner.go:195] Run: openssl version
	I1218 00:43:59.859998 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.866869 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:43:59.873885 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877278 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877331 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.917714 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:43:59.925047 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.932048 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:43:59.939101 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942813 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942866 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.983421 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:43:59.990593 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:43:59.997725 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:44:00.042943 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059312 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059393 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.179416 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:44:00.199517 1201669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:44:00.211411 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:44:00.299862 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:44:00.347783 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:44:00.400161 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:44:00.445236 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:44:00.505288 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:44:00.548440 1201669 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:44:00.548538 1201669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:44:00.548659 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.576531 1201669 cri.go:89] found id: ""
	I1218 00:44:00.576602 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:44:00.584414 1201669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:44:00.584430 1201669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:44:00.584481 1201669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:44:00.591678 1201669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.592197 1201669 kubeconfig.go:125] found "functional-288604" server: "https://192.168.49.2:8441"
	I1218 00:44:00.593407 1201669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:44:00.601066 1201669 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-18 00:29:23.211763247 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-18 00:43:59.405160305 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1218 00:44:00.601075 1201669 kubeadm.go:1161] stopping kube-system containers ...
	I1218 00:44:00.601085 1201669 cri.go:54] listing CRI containers in root : {State:all Name: Namespaces:[kube-system]}
	I1218 00:44:00.601140 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.626991 1201669 cri.go:89] found id: ""
	I1218 00:44:00.627065 1201669 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 00:44:00.640495 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:44:00.648256 1201669 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 18 00:33 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec 18 00:33 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 18 00:33 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 18 00:33 /etc/kubernetes/scheduler.conf
	
	I1218 00:44:00.648311 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:44:00.655772 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:44:00.663347 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.663410 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:44:00.670748 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.677977 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.678031 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.685079 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:44:00.692996 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.693049 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:44:00.700106 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:44:00.707647 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:00.751682 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:01.971643 1201669 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.219916809s)
	I1218 00:44:01.971736 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.213563 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.279593 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.331094 1201669 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:44:02.331177 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:02.831338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.332205 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.831381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.331525 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.832325 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.331379 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.332243 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.831869 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.331354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.831326 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.331942 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.831354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.331370 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.832255 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.331366 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.831363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.831359 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.331357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.331990 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.831891 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.331340 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.832123 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.331341 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.332060 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.831352 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.331755 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.831466 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.331860 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.831293 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.831369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.331585 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.832126 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.331328 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.831986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.331369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.831627 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.331975 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.831268 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.331992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.831394 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.331896 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.831502 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.331383 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.831706 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.332082 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.831353 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.331380 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.832133 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.331347 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.831351 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.332001 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.831800 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.331774 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.831372 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.332276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.832017 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.331329 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.832065 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.331713 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.831374 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.331600 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.332164 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.831455 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.331933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.831358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.332063 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.831460 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.331554 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.832152 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.331280 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.831272 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.332273 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.832020 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.331662 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.831758 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.331412 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.831371 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.332088 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.831480 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.332490 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.832201 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.331816 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.831276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.331408 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.831739 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.331262 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.831814 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.332083 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.832108 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.331984 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.831507 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.331363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.831505 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.332120 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.831384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.332279 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.831590 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.331361 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.831933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.331338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.332254 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.832148 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.331950 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.831349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.332302 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.832264 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.331912 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.832145 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.331498 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.831848 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.331497 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:02.332289 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:02.332395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:02.358402 1201669 cri.go:89] found id: ""
	I1218 00:45:02.358416 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.358424 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:02.358429 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:02.358493 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:02.386799 1201669 cri.go:89] found id: ""
	I1218 00:45:02.386814 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.386821 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:02.386825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:02.386882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:02.419430 1201669 cri.go:89] found id: ""
	I1218 00:45:02.419445 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.419453 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:02.419460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:02.419560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:02.445313 1201669 cri.go:89] found id: ""
	I1218 00:45:02.445326 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.445333 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:02.445338 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:02.445395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:02.474189 1201669 cri.go:89] found id: ""
	I1218 00:45:02.474203 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.474210 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:02.474215 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:02.474278 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:02.501782 1201669 cri.go:89] found id: ""
	I1218 00:45:02.501796 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.501803 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:02.501808 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:02.501867 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:02.531648 1201669 cri.go:89] found id: ""
	I1218 00:45:02.531662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.531669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:02.531677 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:02.531690 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:02.597077 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:02.597095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:02.612827 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:02.612845 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:02.680833 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:02.680844 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:02.680855 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:02.749861 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:02.749884 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:05.287966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:05.298109 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:05.298171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:05.323714 1201669 cri.go:89] found id: ""
	I1218 00:45:05.323727 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.323733 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:05.323739 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:05.323800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:05.348520 1201669 cri.go:89] found id: ""
	I1218 00:45:05.348534 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.348541 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:05.348546 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:05.348604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:05.373275 1201669 cri.go:89] found id: ""
	I1218 00:45:05.373290 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.373297 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:05.373302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:05.373362 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:05.397833 1201669 cri.go:89] found id: ""
	I1218 00:45:05.397846 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.397853 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:05.397859 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:05.397921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:05.422938 1201669 cri.go:89] found id: ""
	I1218 00:45:05.422952 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.422959 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:05.422964 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:05.423026 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:05.451027 1201669 cri.go:89] found id: ""
	I1218 00:45:05.451041 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.451048 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:05.451053 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:05.451115 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:05.477082 1201669 cri.go:89] found id: ""
	I1218 00:45:05.477096 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.477102 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:05.477110 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:05.477120 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:05.543065 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:05.543083 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:05.558032 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:05.558047 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:05.623058 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:05.623071 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:05.623081 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:05.694967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:05.694987 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.224381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:08.234565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:08.234639 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:08.262640 1201669 cri.go:89] found id: ""
	I1218 00:45:08.262654 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.262661 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:08.262667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:08.262724 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:08.288384 1201669 cri.go:89] found id: ""
	I1218 00:45:08.288397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.288404 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:08.288409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:08.288468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:08.314880 1201669 cri.go:89] found id: ""
	I1218 00:45:08.314893 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.314900 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:08.314911 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:08.314971 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:08.340105 1201669 cri.go:89] found id: ""
	I1218 00:45:08.340119 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.340125 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:08.340131 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:08.340202 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:08.370009 1201669 cri.go:89] found id: ""
	I1218 00:45:08.370023 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.370030 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:08.370035 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:08.370094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:08.394925 1201669 cri.go:89] found id: ""
	I1218 00:45:08.394939 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.394946 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:08.394951 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:08.395013 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:08.419448 1201669 cri.go:89] found id: ""
	I1218 00:45:08.419462 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.419469 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:08.419477 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:08.419487 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:08.493271 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:08.493290 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.521236 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:08.521251 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:08.591011 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:08.591030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:08.605700 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:08.605716 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:08.674615 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.175336 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:11.186731 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:11.186790 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:11.217495 1201669 cri.go:89] found id: ""
	I1218 00:45:11.217510 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.217517 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:11.217522 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:11.217579 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:11.242494 1201669 cri.go:89] found id: ""
	I1218 00:45:11.242506 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.242514 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:11.242519 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:11.242588 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:11.269562 1201669 cri.go:89] found id: ""
	I1218 00:45:11.269576 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.269583 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:11.269588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:11.269646 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:11.296483 1201669 cri.go:89] found id: ""
	I1218 00:45:11.296497 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.296503 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:11.296517 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:11.296573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:11.324023 1201669 cri.go:89] found id: ""
	I1218 00:45:11.324037 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.324044 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:11.324049 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:11.324107 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:11.350813 1201669 cri.go:89] found id: ""
	I1218 00:45:11.350826 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.350833 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:11.350838 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:11.350915 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:11.375508 1201669 cri.go:89] found id: ""
	I1218 00:45:11.375522 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.375529 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:11.375538 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:11.375548 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:11.443170 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:11.443196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:11.458193 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:11.458209 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:11.526119 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.526129 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:11.526139 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:11.598390 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:11.598409 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:14.127470 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:14.140176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:14.140248 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:14.175467 1201669 cri.go:89] found id: ""
	I1218 00:45:14.175481 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.175488 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:14.175493 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:14.175550 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:14.205623 1201669 cri.go:89] found id: ""
	I1218 00:45:14.205637 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.205649 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:14.205655 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:14.205727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:14.232765 1201669 cri.go:89] found id: ""
	I1218 00:45:14.232779 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.232786 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:14.232790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:14.232848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:14.259382 1201669 cri.go:89] found id: ""
	I1218 00:45:14.259396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.259403 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:14.259408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:14.259465 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:14.284118 1201669 cri.go:89] found id: ""
	I1218 00:45:14.284132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.284139 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:14.284144 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:14.284205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:14.308510 1201669 cri.go:89] found id: ""
	I1218 00:45:14.308530 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.308536 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:14.308552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:14.308619 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:14.336798 1201669 cri.go:89] found id: ""
	I1218 00:45:14.336811 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.336819 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:14.336826 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:14.336837 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:14.402054 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:14.402074 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:14.416289 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:14.416306 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:14.480242 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:14.480255 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:14.480265 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:14.549733 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:14.549753 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:17.078515 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:17.088248 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:17.088306 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:17.112977 1201669 cri.go:89] found id: ""
	I1218 00:45:17.112990 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.112998 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:17.113004 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:17.113062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:17.137141 1201669 cri.go:89] found id: ""
	I1218 00:45:17.137154 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.137161 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:17.137167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:17.137223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:17.166013 1201669 cri.go:89] found id: ""
	I1218 00:45:17.166026 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.166033 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:17.166038 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:17.166098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:17.194884 1201669 cri.go:89] found id: ""
	I1218 00:45:17.194906 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.194920 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:17.194925 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:17.194990 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:17.220329 1201669 cri.go:89] found id: ""
	I1218 00:45:17.220342 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.220349 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:17.220354 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:17.220415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:17.248333 1201669 cri.go:89] found id: ""
	I1218 00:45:17.248347 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.248353 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:17.248359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:17.248415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:17.273057 1201669 cri.go:89] found id: ""
	I1218 00:45:17.273074 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.273084 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:17.273093 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:17.273104 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:17.339448 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:17.339467 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:17.354635 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:17.354652 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:17.422682 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:17.422703 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:17.422714 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:17.490930 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:17.490951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.021992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:20.032625 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:20.032687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:20.060698 1201669 cri.go:89] found id: ""
	I1218 00:45:20.060712 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.060719 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:20.060724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:20.060785 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:20.086680 1201669 cri.go:89] found id: ""
	I1218 00:45:20.086694 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.086701 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:20.086706 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:20.086766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:20.112553 1201669 cri.go:89] found id: ""
	I1218 00:45:20.112567 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.112574 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:20.112579 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:20.112642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:20.137056 1201669 cri.go:89] found id: ""
	I1218 00:45:20.137070 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.137077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:20.137082 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:20.137148 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:20.175745 1201669 cri.go:89] found id: ""
	I1218 00:45:20.175758 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.175775 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:20.175780 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:20.175848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:20.205557 1201669 cri.go:89] found id: ""
	I1218 00:45:20.205570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.205578 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:20.205583 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:20.205645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:20.234726 1201669 cri.go:89] found id: ""
	I1218 00:45:20.234739 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.234746 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:20.234754 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:20.234773 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:20.303025 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:20.303044 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.331096 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:20.331118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:20.397831 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:20.397856 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:20.412745 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:20.412761 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:20.480267 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:22.980543 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:22.990690 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:22.990747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:23.016762 1201669 cri.go:89] found id: ""
	I1218 00:45:23.016795 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.016802 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:23.016807 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:23.016868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:23.042294 1201669 cri.go:89] found id: ""
	I1218 00:45:23.042308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.042315 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:23.042320 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:23.042379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:23.071377 1201669 cri.go:89] found id: ""
	I1218 00:45:23.071392 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.071399 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:23.071405 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:23.071463 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:23.096911 1201669 cri.go:89] found id: ""
	I1218 00:45:23.096925 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.096932 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:23.096938 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:23.097002 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:23.123350 1201669 cri.go:89] found id: ""
	I1218 00:45:23.123363 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.123370 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:23.123375 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:23.123455 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:23.156384 1201669 cri.go:89] found id: ""
	I1218 00:45:23.156397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.156404 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:23.156409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:23.156470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:23.198764 1201669 cri.go:89] found id: ""
	I1218 00:45:23.198777 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.198784 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:23.198792 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:23.198802 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:23.276991 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:23.277016 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:23.305929 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:23.305946 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:23.374243 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:23.374263 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:23.389391 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:23.389408 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:23.455741 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:25.956010 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:25.966329 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:25.966402 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:25.992362 1201669 cri.go:89] found id: ""
	I1218 00:45:25.992376 1201669 logs.go:282] 0 containers: []
	W1218 00:45:25.992383 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:25.992388 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:25.992446 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:26.020474 1201669 cri.go:89] found id: ""
	I1218 00:45:26.020487 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.020495 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:26.020500 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:26.020562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:26.053060 1201669 cri.go:89] found id: ""
	I1218 00:45:26.053083 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.053090 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:26.053096 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:26.053168 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:26.080555 1201669 cri.go:89] found id: ""
	I1218 00:45:26.080570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.080577 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:26.080582 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:26.080642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:26.106383 1201669 cri.go:89] found id: ""
	I1218 00:45:26.106396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.106405 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:26.106413 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:26.106472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:26.133033 1201669 cri.go:89] found id: ""
	I1218 00:45:26.133046 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.133053 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:26.133059 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:26.133114 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:26.166644 1201669 cri.go:89] found id: ""
	I1218 00:45:26.166662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.166669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:26.166683 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:26.166693 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:26.249137 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:26.249156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:26.266352 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:26.266372 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:26.337214 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:26.337225 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:26.337235 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:26.407577 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:26.407597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:28.937809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:28.947798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:28.947860 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:28.972642 1201669 cri.go:89] found id: ""
	I1218 00:45:28.972655 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.972662 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:28.972667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:28.972727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:28.997812 1201669 cri.go:89] found id: ""
	I1218 00:45:28.997827 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.997834 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:28.997839 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:28.997897 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:29.025172 1201669 cri.go:89] found id: ""
	I1218 00:45:29.025188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.025195 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:29.025200 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:29.025261 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:29.050129 1201669 cri.go:89] found id: ""
	I1218 00:45:29.050143 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.050151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:29.050156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:29.050216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:29.074056 1201669 cri.go:89] found id: ""
	I1218 00:45:29.074069 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.074076 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:29.074081 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:29.074138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:29.102343 1201669 cri.go:89] found id: ""
	I1218 00:45:29.102356 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.102363 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:29.102369 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:29.102426 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:29.126969 1201669 cri.go:89] found id: ""
	I1218 00:45:29.126982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.126989 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:29.126996 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:29.127007 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:29.201687 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:29.201704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:29.216680 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:29.216696 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:29.290639 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:29.290648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:29.290670 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:29.363990 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:29.364013 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:31.902567 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:31.912532 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:31.912590 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:31.938305 1201669 cri.go:89] found id: ""
	I1218 00:45:31.938319 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.938326 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:31.938331 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:31.938387 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:31.962545 1201669 cri.go:89] found id: ""
	I1218 00:45:31.962558 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.962565 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:31.962570 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:31.962632 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:31.987508 1201669 cri.go:89] found id: ""
	I1218 00:45:31.987521 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.987529 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:31.987534 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:31.987592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:32.014382 1201669 cri.go:89] found id: ""
	I1218 00:45:32.014395 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.014402 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:32.014408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:32.014474 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:32.041186 1201669 cri.go:89] found id: ""
	I1218 00:45:32.041200 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.041207 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:32.041212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:32.041271 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:32.067285 1201669 cri.go:89] found id: ""
	I1218 00:45:32.067308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.067316 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:32.067322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:32.067382 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:32.094234 1201669 cri.go:89] found id: ""
	I1218 00:45:32.094247 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.094254 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:32.094262 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:32.094272 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:32.164781 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:32.164800 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:32.197838 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:32.197854 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:32.268628 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:32.268648 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:32.282984 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:32.283001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:32.352888 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:34.853182 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:34.863312 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:34.863372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:34.887731 1201669 cri.go:89] found id: ""
	I1218 00:45:34.887745 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.887751 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:34.887756 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:34.887813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:34.913433 1201669 cri.go:89] found id: ""
	I1218 00:45:34.913446 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.913453 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:34.913458 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:34.913525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:34.938029 1201669 cri.go:89] found id: ""
	I1218 00:45:34.938043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.938050 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:34.938056 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:34.938125 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:34.963314 1201669 cri.go:89] found id: ""
	I1218 00:45:34.963327 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.963334 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:34.963339 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:34.963395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:34.991684 1201669 cri.go:89] found id: ""
	I1218 00:45:34.991699 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.991706 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:34.991711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:34.991775 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:35.019323 1201669 cri.go:89] found id: ""
	I1218 00:45:35.019338 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.019344 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:35.019350 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:35.019412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:35.044946 1201669 cri.go:89] found id: ""
	I1218 00:45:35.044960 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.044966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:35.044975 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:35.044986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:35.059688 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:35.059704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:35.127679 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:35.127700 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:35.127711 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:35.200793 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:35.200812 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:35.229277 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:35.229293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:37.797709 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:37.807597 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:37.807657 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:37.834367 1201669 cri.go:89] found id: ""
	I1218 00:45:37.834381 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.834399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:37.834404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:37.834466 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:37.862884 1201669 cri.go:89] found id: ""
	I1218 00:45:37.862898 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.862905 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:37.862910 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:37.862967 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:37.887715 1201669 cri.go:89] found id: ""
	I1218 00:45:37.887729 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.887736 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:37.887741 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:37.887800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:37.912412 1201669 cri.go:89] found id: ""
	I1218 00:45:37.912425 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.912432 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:37.912437 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:37.912500 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:37.936195 1201669 cri.go:89] found id: ""
	I1218 00:45:37.936209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.936216 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:37.936250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:37.936308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:37.964630 1201669 cri.go:89] found id: ""
	I1218 00:45:37.964645 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.964658 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:37.964663 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:37.964718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:37.997426 1201669 cri.go:89] found id: ""
	I1218 00:45:37.997439 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.997446 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:37.997454 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:37.997468 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:38.035686 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:38.035710 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:38.103558 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:38.103578 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:38.118520 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:38.118538 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:38.213391 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:38.213399 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:38.213410 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:40.782711 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:40.792421 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:40.792487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:40.816808 1201669 cri.go:89] found id: ""
	I1218 00:45:40.816821 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.816828 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:40.816833 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:40.816889 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:40.842296 1201669 cri.go:89] found id: ""
	I1218 00:45:40.842309 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.842316 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:40.842321 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:40.842381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:40.870550 1201669 cri.go:89] found id: ""
	I1218 00:45:40.870563 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.870570 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:40.870575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:40.870631 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:40.895987 1201669 cri.go:89] found id: ""
	I1218 00:45:40.896000 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.896007 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:40.896012 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:40.896071 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:40.922196 1201669 cri.go:89] found id: ""
	I1218 00:45:40.922209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.922217 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:40.922228 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:40.922287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:40.951012 1201669 cri.go:89] found id: ""
	I1218 00:45:40.951025 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.951032 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:40.951037 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:40.951094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:40.975029 1201669 cri.go:89] found id: ""
	I1218 00:45:40.975043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.975049 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:40.975057 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:40.975068 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:41.038362 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:41.038371 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:41.038383 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:41.106531 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:41.106550 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:41.133380 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:41.133396 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:41.202955 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:41.202974 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.720946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:43.730523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:43.730580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:43.758480 1201669 cri.go:89] found id: ""
	I1218 00:45:43.758494 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.758501 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:43.758506 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:43.758562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:43.782891 1201669 cri.go:89] found id: ""
	I1218 00:45:43.782904 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.782910 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:43.782915 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:43.782969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:43.807881 1201669 cri.go:89] found id: ""
	I1218 00:45:43.807895 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.807901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:43.807906 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:43.807962 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:43.831922 1201669 cri.go:89] found id: ""
	I1218 00:45:43.831934 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.831941 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:43.831946 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:43.832005 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:43.857303 1201669 cri.go:89] found id: ""
	I1218 00:45:43.857316 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.857323 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:43.857328 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:43.857385 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:43.882932 1201669 cri.go:89] found id: ""
	I1218 00:45:43.882945 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.882962 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:43.882967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:43.883034 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:43.910989 1201669 cri.go:89] found id: ""
	I1218 00:45:43.911003 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.911010 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:43.911017 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:43.911027 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:43.976855 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:43.976875 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.992065 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:43.992080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:44.066663 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:44.066673 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:44.066683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:44.136150 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:44.136169 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.674809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:46.685189 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:46.685253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:46.710336 1201669 cri.go:89] found id: ""
	I1218 00:45:46.710350 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.710357 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:46.710362 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:46.710423 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:46.735331 1201669 cri.go:89] found id: ""
	I1218 00:45:46.735344 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.735351 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:46.735356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:46.735412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:46.760113 1201669 cri.go:89] found id: ""
	I1218 00:45:46.760126 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.760133 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:46.760138 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:46.760192 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:46.785212 1201669 cri.go:89] found id: ""
	I1218 00:45:46.785225 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.785231 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:46.785237 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:46.785292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:46.810594 1201669 cri.go:89] found id: ""
	I1218 00:45:46.810607 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.810614 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:46.810619 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:46.810678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:46.835217 1201669 cri.go:89] found id: ""
	I1218 00:45:46.835231 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.835237 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:46.835242 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:46.835300 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:46.859864 1201669 cri.go:89] found id: ""
	I1218 00:45:46.859877 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.859891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:46.859899 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:46.859910 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.887041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:46.887057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:46.953500 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:46.953519 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:46.968086 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:46.968102 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:47.030071 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:47.030081 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:47.030091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.602443 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:49.612708 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:49.612770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:49.638886 1201669 cri.go:89] found id: ""
	I1218 00:45:49.638900 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.638907 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:49.638912 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:49.638969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:49.666118 1201669 cri.go:89] found id: ""
	I1218 00:45:49.666132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.666139 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:49.666145 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:49.666205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:49.695529 1201669 cri.go:89] found id: ""
	I1218 00:45:49.695542 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.695549 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:49.695554 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:49.695609 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:49.718430 1201669 cri.go:89] found id: ""
	I1218 00:45:49.718444 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.718451 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:49.718457 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:49.718514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:49.742944 1201669 cri.go:89] found id: ""
	I1218 00:45:49.742957 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.742964 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:49.742969 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:49.743028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:49.767863 1201669 cri.go:89] found id: ""
	I1218 00:45:49.767876 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.767888 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:49.767894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:49.767949 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:49.792207 1201669 cri.go:89] found id: ""
	I1218 00:45:49.792254 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.792261 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:49.792269 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:49.792279 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:49.806632 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:49.806655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:49.869094 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:49.869105 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:49.869130 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.936480 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:49.936498 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:49.965414 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:49.965430 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.533961 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:52.543970 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:52.544028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:52.569650 1201669 cri.go:89] found id: ""
	I1218 00:45:52.569663 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.569671 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:52.569676 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:52.569735 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:52.593935 1201669 cri.go:89] found id: ""
	I1218 00:45:52.593949 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.593955 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:52.593961 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:52.594019 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:52.618968 1201669 cri.go:89] found id: ""
	I1218 00:45:52.618982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.618989 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:52.618994 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:52.619051 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:52.647696 1201669 cri.go:89] found id: ""
	I1218 00:45:52.647710 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.647717 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:52.647728 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:52.647787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:52.675609 1201669 cri.go:89] found id: ""
	I1218 00:45:52.675622 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.675629 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:52.675634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:52.675690 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:52.701982 1201669 cri.go:89] found id: ""
	I1218 00:45:52.701995 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.702001 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:52.702007 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:52.702064 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:52.727053 1201669 cri.go:89] found id: ""
	I1218 00:45:52.727066 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.727073 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:52.727081 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:52.727091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.793606 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:52.793626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:52.807921 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:52.807938 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:52.871908 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:52.871918 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:52.871942 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:52.939995 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:52.940015 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.467573 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:55.477751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:55.477808 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:55.503215 1201669 cri.go:89] found id: ""
	I1218 00:45:55.503229 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.503235 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:55.503241 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:55.503299 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:55.528321 1201669 cri.go:89] found id: ""
	I1218 00:45:55.528334 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.528341 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:55.528346 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:55.528406 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:55.555566 1201669 cri.go:89] found id: ""
	I1218 00:45:55.555580 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.555586 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:55.555591 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:55.555659 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:55.580858 1201669 cri.go:89] found id: ""
	I1218 00:45:55.580870 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.580877 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:55.580882 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:55.580941 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:55.609703 1201669 cri.go:89] found id: ""
	I1218 00:45:55.609717 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.609724 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:55.609729 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:55.609792 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:55.635271 1201669 cri.go:89] found id: ""
	I1218 00:45:55.635285 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.635301 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:55.635307 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:55.635379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:55.664174 1201669 cri.go:89] found id: ""
	I1218 00:45:55.664188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.664203 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:55.664211 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:55.664247 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:55.678574 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:55.678597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:55.741880 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:55.741890 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:55.741900 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:55.814783 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:55.814804 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.845128 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:55.845151 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.416331 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:58.426299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:58.426355 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:58.457684 1201669 cri.go:89] found id: ""
	I1218 00:45:58.457698 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.457705 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:58.457710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:58.457769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:58.482307 1201669 cri.go:89] found id: ""
	I1218 00:45:58.482320 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.482327 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:58.482332 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:58.482389 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:58.507442 1201669 cri.go:89] found id: ""
	I1218 00:45:58.507454 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.507461 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:58.507466 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:58.507523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:58.536949 1201669 cri.go:89] found id: ""
	I1218 00:45:58.536963 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.536969 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:58.536974 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:58.537030 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:58.565233 1201669 cri.go:89] found id: ""
	I1218 00:45:58.565246 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.565253 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:58.565257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:58.565313 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:58.589568 1201669 cri.go:89] found id: ""
	I1218 00:45:58.589582 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.589589 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:58.589594 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:58.589655 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:58.613117 1201669 cri.go:89] found id: ""
	I1218 00:45:58.613130 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.613137 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:58.613145 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:58.613156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:58.681549 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:58.681572 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:58.709658 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:58.709678 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.778632 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:58.778651 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:58.793209 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:58.793225 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:58.857093 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:01.358084 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:01.368502 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:01.368561 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:01.399458 1201669 cri.go:89] found id: ""
	I1218 00:46:01.399490 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.399498 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:01.399504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:01.399589 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:01.429331 1201669 cri.go:89] found id: ""
	I1218 00:46:01.429346 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.429353 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:01.429359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:01.429418 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:01.463764 1201669 cri.go:89] found id: ""
	I1218 00:46:01.463777 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.463784 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:01.463792 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:01.463852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:01.490438 1201669 cri.go:89] found id: ""
	I1218 00:46:01.490451 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.490458 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:01.490464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:01.490523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:01.515150 1201669 cri.go:89] found id: ""
	I1218 00:46:01.515163 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.515170 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:01.515176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:01.515238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:01.541480 1201669 cri.go:89] found id: ""
	I1218 00:46:01.541494 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.541501 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:01.541507 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:01.541567 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:01.566788 1201669 cri.go:89] found id: ""
	I1218 00:46:01.566802 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.566809 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:01.566817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:01.566827 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:01.630909 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:01.622550   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.623304   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.624851   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.625362   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.626989   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:01.622550   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.623304   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.624851   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.625362   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.626989   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:01.630919 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:01.630929 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:01.699339 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:01.699360 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:01.730198 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:01.730213 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:01.798536 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:01.798555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:04.314812 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:04.325258 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:04.325319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:04.350282 1201669 cri.go:89] found id: ""
	I1218 00:46:04.350302 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.350309 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:04.350314 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:04.350374 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:04.375290 1201669 cri.go:89] found id: ""
	I1218 00:46:04.375305 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.375311 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:04.375316 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:04.375381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:04.410898 1201669 cri.go:89] found id: ""
	I1218 00:46:04.410911 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.410918 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:04.410923 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:04.410980 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:04.448128 1201669 cri.go:89] found id: ""
	I1218 00:46:04.448141 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.448151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:04.448156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:04.448214 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:04.478635 1201669 cri.go:89] found id: ""
	I1218 00:46:04.478648 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.478655 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:04.478660 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:04.478718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:04.504261 1201669 cri.go:89] found id: ""
	I1218 00:46:04.504275 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.504282 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:04.504288 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:04.504345 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:04.529823 1201669 cri.go:89] found id: ""
	I1218 00:46:04.529836 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.529843 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:04.529851 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:04.529862 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:04.595056 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:04.587112   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.587762   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589353   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589778   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.591223   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:04.587112   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.587762   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589353   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589778   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.591223   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:04.595066 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:04.595076 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:04.665580 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:04.665600 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:04.695540 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:04.695555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:04.766700 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:04.766721 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.281438 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:07.291184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:07.291241 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:07.318270 1201669 cri.go:89] found id: ""
	I1218 00:46:07.318283 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.318290 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:07.318295 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:07.318353 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:07.342684 1201669 cri.go:89] found id: ""
	I1218 00:46:07.342697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.342704 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:07.342718 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:07.342777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:07.367159 1201669 cri.go:89] found id: ""
	I1218 00:46:07.367173 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.367180 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:07.367186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:07.367252 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:07.399917 1201669 cri.go:89] found id: ""
	I1218 00:46:07.399942 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.399949 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:07.399954 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:07.400025 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:07.428891 1201669 cri.go:89] found id: ""
	I1218 00:46:07.428904 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.428911 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:07.428918 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:07.428988 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:07.461232 1201669 cri.go:89] found id: ""
	I1218 00:46:07.461244 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.461251 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:07.461257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:07.461319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:07.487577 1201669 cri.go:89] found id: ""
	I1218 00:46:07.487590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.487607 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:07.487616 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:07.487626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:07.554637 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:07.554656 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.570064 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:07.570080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:07.635097 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:07.627057   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.627642   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629308   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629740   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.631233   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:07.627057   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.627642   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629308   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629740   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.631233   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:07.635107 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:07.635118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:07.706762 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:07.706782 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.235305 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:10.245498 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:10.245568 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:10.275954 1201669 cri.go:89] found id: ""
	I1218 00:46:10.275965 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.275972 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:10.275985 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:10.276042 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:10.301377 1201669 cri.go:89] found id: ""
	I1218 00:46:10.301391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.301397 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:10.301402 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:10.301468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:10.327075 1201669 cri.go:89] found id: ""
	I1218 00:46:10.327089 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.327096 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:10.327101 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:10.327163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:10.355039 1201669 cri.go:89] found id: ""
	I1218 00:46:10.355052 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.355059 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:10.355064 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:10.355126 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:10.380800 1201669 cri.go:89] found id: ""
	I1218 00:46:10.380814 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.380821 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:10.380826 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:10.380883 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:10.420766 1201669 cri.go:89] found id: ""
	I1218 00:46:10.420781 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.420788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:10.420794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:10.420852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:10.450993 1201669 cri.go:89] found id: ""
	I1218 00:46:10.451006 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.451013 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:10.451021 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:10.451031 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:10.469649 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:10.469664 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:10.534853 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:10.534862 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:10.534873 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:10.603061 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:10.603080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.634944 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:10.634961 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.201986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:13.212552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:13.212611 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:13.236454 1201669 cri.go:89] found id: ""
	I1218 00:46:13.236468 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.236475 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:13.236481 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:13.236542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:13.261394 1201669 cri.go:89] found id: ""
	I1218 00:46:13.261408 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.261415 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:13.261420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:13.261479 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:13.286366 1201669 cri.go:89] found id: ""
	I1218 00:46:13.286380 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.286393 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:13.286398 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:13.286457 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:13.311045 1201669 cri.go:89] found id: ""
	I1218 00:46:13.311058 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.311065 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:13.311070 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:13.311132 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:13.336414 1201669 cri.go:89] found id: ""
	I1218 00:46:13.336427 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.336434 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:13.336439 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:13.336503 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:13.366089 1201669 cri.go:89] found id: ""
	I1218 00:46:13.366102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.366109 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:13.366114 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:13.366170 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:13.398167 1201669 cri.go:89] found id: ""
	I1218 00:46:13.398180 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.398187 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:13.398195 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:13.398205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.472148 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:13.472173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:13.487248 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:13.487267 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:13.552950 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:13.552960 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:13.552973 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:13.622039 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:13.622058 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:16.149384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:16.159725 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:16.159786 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:16.186967 1201669 cri.go:89] found id: ""
	I1218 00:46:16.186981 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.186988 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:16.186993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:16.187052 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:16.213347 1201669 cri.go:89] found id: ""
	I1218 00:46:16.213361 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.213368 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:16.213374 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:16.213431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:16.239666 1201669 cri.go:89] found id: ""
	I1218 00:46:16.239679 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.239686 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:16.239692 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:16.239747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:16.264667 1201669 cri.go:89] found id: ""
	I1218 00:46:16.264680 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.264686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:16.264691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:16.264747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:16.290913 1201669 cri.go:89] found id: ""
	I1218 00:46:16.290925 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.290932 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:16.290937 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:16.290995 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:16.318436 1201669 cri.go:89] found id: ""
	I1218 00:46:16.318449 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.318458 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:16.318464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:16.318522 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:16.344303 1201669 cri.go:89] found id: ""
	I1218 00:46:16.344316 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.344323 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:16.344331 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:16.344342 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:16.411796 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:16.411814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:16.427899 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:16.427916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:16.499022 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:16.499032 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:16.499042 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:16.568931 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:16.568951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.102749 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:19.112504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:19.112560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:19.140375 1201669 cri.go:89] found id: ""
	I1218 00:46:19.140389 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.140396 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:19.140401 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:19.140462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:19.170806 1201669 cri.go:89] found id: ""
	I1218 00:46:19.170832 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.170840 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:19.170848 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:19.170930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:19.202879 1201669 cri.go:89] found id: ""
	I1218 00:46:19.202894 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.202901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:19.202907 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:19.202973 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:19.226832 1201669 cri.go:89] found id: ""
	I1218 00:46:19.226844 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.226851 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:19.226856 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:19.226913 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:19.251251 1201669 cri.go:89] found id: ""
	I1218 00:46:19.251264 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.251271 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:19.251277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:19.251334 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:19.275051 1201669 cri.go:89] found id: ""
	I1218 00:46:19.275064 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.275071 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:19.275080 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:19.275138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:19.303255 1201669 cri.go:89] found id: ""
	I1218 00:46:19.303268 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.303291 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:19.303299 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:19.303309 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.332819 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:19.332836 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:19.398262 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:19.398281 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:19.413015 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:19.413030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:19.483412 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:19.483423 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:19.483475 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:22.052118 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:22.062390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:22.062454 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:22.091903 1201669 cri.go:89] found id: ""
	I1218 00:46:22.091917 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.091924 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:22.091930 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:22.091987 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:22.116458 1201669 cri.go:89] found id: ""
	I1218 00:46:22.116471 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.116478 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:22.116483 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:22.116560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:22.142090 1201669 cri.go:89] found id: ""
	I1218 00:46:22.142102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.142109 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:22.142115 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:22.142180 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:22.166148 1201669 cri.go:89] found id: ""
	I1218 00:46:22.166162 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.166169 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:22.166175 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:22.166234 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:22.191864 1201669 cri.go:89] found id: ""
	I1218 00:46:22.191877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.191884 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:22.191890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:22.191953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:22.216176 1201669 cri.go:89] found id: ""
	I1218 00:46:22.216190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.216197 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:22.216202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:22.216283 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:22.240865 1201669 cri.go:89] found id: ""
	I1218 00:46:22.240878 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.240891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:22.240898 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:22.240908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:22.269665 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:22.269688 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:22.334885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:22.334903 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:22.349240 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:22.349256 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:22.424972 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:22.424982 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:22.425001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.004463 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:25.015873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:25.015934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:25.043533 1201669 cri.go:89] found id: ""
	I1218 00:46:25.043547 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.043558 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:25.043563 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:25.043630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:25.070860 1201669 cri.go:89] found id: ""
	I1218 00:46:25.070874 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.070881 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:25.070887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:25.070945 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:25.100326 1201669 cri.go:89] found id: ""
	I1218 00:46:25.100340 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.100349 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:25.100356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:25.100420 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:25.127292 1201669 cri.go:89] found id: ""
	I1218 00:46:25.127306 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.127313 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:25.127318 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:25.127376 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:25.152929 1201669 cri.go:89] found id: ""
	I1218 00:46:25.152943 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.152950 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:25.152955 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:25.153023 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:25.179602 1201669 cri.go:89] found id: ""
	I1218 00:46:25.179622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.179629 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:25.179634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:25.179691 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:25.204777 1201669 cri.go:89] found id: ""
	I1218 00:46:25.204790 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.204797 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:25.204804 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:25.204814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.274359 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:25.274379 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:25.305207 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:25.305224 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:25.375922 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:25.375941 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:25.392181 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:25.392196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:25.470714 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:27.970992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:27.980972 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:27.981029 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:28.007728 1201669 cri.go:89] found id: ""
	I1218 00:46:28.007744 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.007752 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:28.007758 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:28.007821 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:28.038973 1201669 cri.go:89] found id: ""
	I1218 00:46:28.038987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.038995 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:28.039000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:28.039063 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:28.066609 1201669 cri.go:89] found id: ""
	I1218 00:46:28.066622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.066629 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:28.066634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:28.066695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:28.092484 1201669 cri.go:89] found id: ""
	I1218 00:46:28.092498 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.092506 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:28.092512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:28.092583 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:28.119611 1201669 cri.go:89] found id: ""
	I1218 00:46:28.119625 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.119632 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:28.119638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:28.119698 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:28.145154 1201669 cri.go:89] found id: ""
	I1218 00:46:28.145167 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.145175 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:28.145180 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:28.145238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:28.170178 1201669 cri.go:89] found id: ""
	I1218 00:46:28.170191 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.170198 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:28.170206 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:28.170216 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:28.235805 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:28.235824 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:28.250608 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:28.250629 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:28.314678 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:28.314687 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:28.314698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:28.383399 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:28.383420 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:30.924810 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:30.935068 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:30.935128 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:30.960550 1201669 cri.go:89] found id: ""
	I1218 00:46:30.960563 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.960570 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:30.960575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:30.960636 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:30.985705 1201669 cri.go:89] found id: ""
	I1218 00:46:30.985718 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.985725 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:30.985730 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:30.985787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:31.011725 1201669 cri.go:89] found id: ""
	I1218 00:46:31.011739 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.011746 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:31.011751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:31.011813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:31.038735 1201669 cri.go:89] found id: ""
	I1218 00:46:31.038748 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.038755 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:31.038760 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:31.038822 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:31.062623 1201669 cri.go:89] found id: ""
	I1218 00:46:31.062637 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.062645 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:31.062651 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:31.062716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:31.089339 1201669 cri.go:89] found id: ""
	I1218 00:46:31.089353 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.089366 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:31.089372 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:31.089431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:31.119659 1201669 cri.go:89] found id: ""
	I1218 00:46:31.119672 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.119679 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:31.119687 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:31.119698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:31.185677 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:31.185697 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:31.200077 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:31.200092 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:31.263573 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:31.263582 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:31.263593 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:31.331836 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:31.331857 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:33.859870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:33.871250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:33.871309 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:33.899076 1201669 cri.go:89] found id: ""
	I1218 00:46:33.899090 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.899097 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:33.899103 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:33.899163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:33.927937 1201669 cri.go:89] found id: ""
	I1218 00:46:33.927955 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.927961 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:33.927967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:33.928024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:33.954257 1201669 cri.go:89] found id: ""
	I1218 00:46:33.954271 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.954278 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:33.954283 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:33.954339 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:33.978840 1201669 cri.go:89] found id: ""
	I1218 00:46:33.978853 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.978860 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:33.978865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:33.978921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:34.008172 1201669 cri.go:89] found id: ""
	I1218 00:46:34.008186 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.008193 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:34.008198 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:34.008296 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:34.038029 1201669 cri.go:89] found id: ""
	I1218 00:46:34.038043 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.038050 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:34.038057 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:34.038116 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:34.067280 1201669 cri.go:89] found id: ""
	I1218 00:46:34.067294 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.067302 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:34.067311 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:34.067321 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:34.099533 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:34.099549 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:34.165421 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:34.165442 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:34.179966 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:34.179981 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:34.243670 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:34.243681 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:34.243694 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:36.812424 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:36.822427 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:36.822486 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:36.847846 1201669 cri.go:89] found id: ""
	I1218 00:46:36.847859 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.847866 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:36.847872 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:36.847927 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:36.873323 1201669 cri.go:89] found id: ""
	I1218 00:46:36.873337 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.873344 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:36.873349 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:36.873408 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:36.898528 1201669 cri.go:89] found id: ""
	I1218 00:46:36.898541 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.898547 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:36.898553 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:36.898608 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:36.925176 1201669 cri.go:89] found id: ""
	I1218 00:46:36.925190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.925197 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:36.925202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:36.925260 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:36.954449 1201669 cri.go:89] found id: ""
	I1218 00:46:36.954463 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.954469 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:36.954474 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:36.954533 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:36.978226 1201669 cri.go:89] found id: ""
	I1218 00:46:36.978239 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.978246 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:36.978251 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:36.978308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:37.005731 1201669 cri.go:89] found id: ""
	I1218 00:46:37.005747 1201669 logs.go:282] 0 containers: []
	W1218 00:46:37.005755 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:37.005764 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:37.005776 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:37.026584 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:37.026606 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:37.089657 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:37.089672 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:37.089683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:37.161954 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:37.161980 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:37.189136 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:37.189155 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:39.765929 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:39.776452 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:39.776510 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:39.801519 1201669 cri.go:89] found id: ""
	I1218 00:46:39.801532 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.801539 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:39.801544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:39.801604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:39.829201 1201669 cri.go:89] found id: ""
	I1218 00:46:39.829215 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.829222 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:39.829226 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:39.829287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:39.854274 1201669 cri.go:89] found id: ""
	I1218 00:46:39.854287 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.854294 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:39.854299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:39.854357 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:39.879811 1201669 cri.go:89] found id: ""
	I1218 00:46:39.879824 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.879831 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:39.879836 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:39.879893 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:39.912296 1201669 cri.go:89] found id: ""
	I1218 00:46:39.912310 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.912317 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:39.912322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:39.912380 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:39.939288 1201669 cri.go:89] found id: ""
	I1218 00:46:39.939313 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.939321 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:39.939326 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:39.939393 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:39.967012 1201669 cri.go:89] found id: ""
	I1218 00:46:39.967027 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.967034 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:39.967041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:39.967051 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:40.033896 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:40.033919 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:40.052546 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:40.052564 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:40.123489 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:40.123524 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:40.123537 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:40.195140 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:40.195161 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:42.731664 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:42.741511 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:42.741573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:42.765856 1201669 cri.go:89] found id: ""
	I1218 00:46:42.765869 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.765876 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:42.765881 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:42.765947 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:42.790000 1201669 cri.go:89] found id: ""
	I1218 00:46:42.790013 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.790020 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:42.790025 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:42.790080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:42.814497 1201669 cri.go:89] found id: ""
	I1218 00:46:42.814511 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.814518 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:42.814523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:42.814580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:42.839923 1201669 cri.go:89] found id: ""
	I1218 00:46:42.839937 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.839943 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:42.839948 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:42.840009 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:42.866771 1201669 cri.go:89] found id: ""
	I1218 00:46:42.866784 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.866791 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:42.866798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:42.866856 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:42.894391 1201669 cri.go:89] found id: ""
	I1218 00:46:42.894404 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.894411 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:42.894416 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:42.894481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:42.919369 1201669 cri.go:89] found id: ""
	I1218 00:46:42.919391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.919399 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:42.919408 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:42.919419 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:42.934812 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:42.934829 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:42.998153 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:42.998162 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:42.998173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:43.067475 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:43.067494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:43.097319 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:43.097335 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:45.664349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:45.675110 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:45.675171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:45.707428 1201669 cri.go:89] found id: ""
	I1218 00:46:45.707442 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.707449 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:45.707454 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:45.707512 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:45.732673 1201669 cri.go:89] found id: ""
	I1218 00:46:45.732687 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.732694 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:45.732700 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:45.732759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:45.756652 1201669 cri.go:89] found id: ""
	I1218 00:46:45.756666 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.756673 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:45.756679 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:45.756741 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:45.781416 1201669 cri.go:89] found id: ""
	I1218 00:46:45.781430 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.781437 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:45.781442 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:45.781498 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:45.806268 1201669 cri.go:89] found id: ""
	I1218 00:46:45.806281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.806288 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:45.806294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:45.806363 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:45.831015 1201669 cri.go:89] found id: ""
	I1218 00:46:45.831028 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.831035 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:45.831040 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:45.831098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:45.855951 1201669 cri.go:89] found id: ""
	I1218 00:46:45.855964 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.855970 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:45.855978 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:45.855988 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:45.870419 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:45.870436 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:45.934620 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:45.934630 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:45.934641 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:46.007377 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:46.007400 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:46.038285 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:46.038302 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.604685 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:48.614701 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:48.614759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:48.640971 1201669 cri.go:89] found id: ""
	I1218 00:46:48.640984 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.640991 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:48.640997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:48.641055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:48.670241 1201669 cri.go:89] found id: ""
	I1218 00:46:48.670254 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.670261 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:48.670266 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:48.670324 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:48.714267 1201669 cri.go:89] found id: ""
	I1218 00:46:48.714281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.714288 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:48.714294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:48.714359 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:48.743058 1201669 cri.go:89] found id: ""
	I1218 00:46:48.743071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.743077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:48.743083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:48.743146 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:48.768865 1201669 cri.go:89] found id: ""
	I1218 00:46:48.768877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.768885 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:48.768890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:48.768950 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:48.794057 1201669 cri.go:89] found id: ""
	I1218 00:46:48.794071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.794078 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:48.794083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:48.794139 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:48.824069 1201669 cri.go:89] found id: ""
	I1218 00:46:48.824082 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.824090 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:48.824102 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:48.824112 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.893155 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:48.893176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:48.908605 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:48.908621 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:48.974531 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:48.974541 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:48.974551 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:49.047912 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:49.047931 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.578760 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:51.588638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:51.588697 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:51.620629 1201669 cri.go:89] found id: ""
	I1218 00:46:51.620643 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.620649 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:51.620661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:51.620737 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:51.653267 1201669 cri.go:89] found id: ""
	I1218 00:46:51.653281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.653297 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:51.653302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:51.653372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:51.680215 1201669 cri.go:89] found id: ""
	I1218 00:46:51.680250 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.680257 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:51.680263 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:51.680328 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:51.712435 1201669 cri.go:89] found id: ""
	I1218 00:46:51.712448 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.712455 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:51.712460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:51.712525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:51.740973 1201669 cri.go:89] found id: ""
	I1218 00:46:51.740987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.740994 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:51.741000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:51.741057 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:51.765683 1201669 cri.go:89] found id: ""
	I1218 00:46:51.765697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.765704 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:51.765710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:51.765767 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:51.793065 1201669 cri.go:89] found id: ""
	I1218 00:46:51.793080 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.793088 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:51.793095 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:51.793106 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:51.807847 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:51.807863 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:51.870944 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:51.870953 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:51.870964 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:51.939037 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:51.939057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.973517 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:51.973532 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:54.540109 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:54.550150 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:54.550216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:54.579006 1201669 cri.go:89] found id: ""
	I1218 00:46:54.579019 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.579026 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:54.579031 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:54.579088 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:54.609045 1201669 cri.go:89] found id: ""
	I1218 00:46:54.609059 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.609066 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:54.609071 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:54.609130 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:54.640693 1201669 cri.go:89] found id: ""
	I1218 00:46:54.640707 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.640714 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:54.640720 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:54.640777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:54.674577 1201669 cri.go:89] found id: ""
	I1218 00:46:54.674590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.674597 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:54.674603 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:54.674658 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:54.709862 1201669 cri.go:89] found id: ""
	I1218 00:46:54.709875 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.709882 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:54.709887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:54.709946 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:54.735151 1201669 cri.go:89] found id: ""
	I1218 00:46:54.735165 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.735171 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:54.735177 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:54.735237 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:54.762946 1201669 cri.go:89] found id: ""
	I1218 00:46:54.762960 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.762966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:54.762974 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:54.762984 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:54.778250 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:54.778266 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:54.841698 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:54.841707 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:54.841718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:54.909164 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:54.909183 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:54.946219 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:54.946236 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.515189 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:57.525323 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:57.525384 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:57.550694 1201669 cri.go:89] found id: ""
	I1218 00:46:57.550708 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.550716 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:57.550721 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:57.550782 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:57.578567 1201669 cri.go:89] found id: ""
	I1218 00:46:57.578582 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.578590 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:57.578595 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:57.578656 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:57.604092 1201669 cri.go:89] found id: ""
	I1218 00:46:57.604105 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.604112 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:57.604120 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:57.604178 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:57.628719 1201669 cri.go:89] found id: ""
	I1218 00:46:57.628733 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.628739 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:57.628744 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:57.628806 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:57.666872 1201669 cri.go:89] found id: ""
	I1218 00:46:57.666885 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.666892 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:57.666897 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:57.666954 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:57.703636 1201669 cri.go:89] found id: ""
	I1218 00:46:57.703649 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.703656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:57.703661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:57.703721 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:57.729878 1201669 cri.go:89] found id: ""
	I1218 00:46:57.729891 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.729898 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:57.729905 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:57.729916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.793892 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:57.793911 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:57.808664 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:57.808680 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:57.871552 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:57.871570 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:57.871582 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:57.939629 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:57.939649 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:00.470791 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:00.480890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:00.480955 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:00.510265 1201669 cri.go:89] found id: ""
	I1218 00:47:00.510278 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.510285 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:00.510290 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:00.510349 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:00.534908 1201669 cri.go:89] found id: ""
	I1218 00:47:00.534922 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.534929 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:00.534934 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:00.534992 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:00.559619 1201669 cri.go:89] found id: ""
	I1218 00:47:00.559632 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.559639 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:00.559644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:00.559705 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:00.587698 1201669 cri.go:89] found id: ""
	I1218 00:47:00.587711 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.587719 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:00.587724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:00.587781 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:00.611884 1201669 cri.go:89] found id: ""
	I1218 00:47:00.611897 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.611904 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:00.611909 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:00.611974 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:00.640874 1201669 cri.go:89] found id: ""
	I1218 00:47:00.640888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.640895 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:00.640900 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:00.640965 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:00.674185 1201669 cri.go:89] found id: ""
	I1218 00:47:00.674198 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.674205 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:00.674213 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:00.674223 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:00.750327 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:00.750347 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:00.765877 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:00.765899 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:00.831441 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:00.831450 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:00.831462 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:00.899398 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:00.899423 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:03.427398 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:03.437572 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:03.437634 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:03.466927 1201669 cri.go:89] found id: ""
	I1218 00:47:03.466940 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.466948 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:03.466952 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:03.467011 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:03.492647 1201669 cri.go:89] found id: ""
	I1218 00:47:03.492661 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.492668 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:03.492672 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:03.492729 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:03.522689 1201669 cri.go:89] found id: ""
	I1218 00:47:03.522702 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.522709 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:03.522714 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:03.522774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:03.547665 1201669 cri.go:89] found id: ""
	I1218 00:47:03.547679 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.547686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:03.547691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:03.547754 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:03.573125 1201669 cri.go:89] found id: ""
	I1218 00:47:03.573139 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.573146 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:03.573151 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:03.573209 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:03.598799 1201669 cri.go:89] found id: ""
	I1218 00:47:03.598812 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.598819 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:03.598825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:03.598882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:03.622999 1201669 cri.go:89] found id: ""
	I1218 00:47:03.623013 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.623019 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:03.623027 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:03.623037 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:03.697686 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:03.697703 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:03.715817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:03.715833 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:03.782593 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:03.782603 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:03.782616 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:03.850592 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:03.850611 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:06.381230 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:06.390993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:06.391053 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:06.414602 1201669 cri.go:89] found id: ""
	I1218 00:47:06.414616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.414622 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:06.414628 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:06.414684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:06.438729 1201669 cri.go:89] found id: ""
	I1218 00:47:06.438743 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.438750 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:06.438755 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:06.438820 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:06.463196 1201669 cri.go:89] found id: ""
	I1218 00:47:06.463208 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.463215 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:06.463220 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:06.463275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:06.488161 1201669 cri.go:89] found id: ""
	I1218 00:47:06.488174 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.488181 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:06.488186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:06.488275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:06.517546 1201669 cri.go:89] found id: ""
	I1218 00:47:06.517559 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.517566 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:06.517571 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:06.517630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:06.541811 1201669 cri.go:89] found id: ""
	I1218 00:47:06.541825 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.541831 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:06.541837 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:06.541894 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:06.565470 1201669 cri.go:89] found id: ""
	I1218 00:47:06.565483 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.565491 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:06.565501 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:06.565511 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:06.630810 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:06.630828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:06.650036 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:06.650061 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:06.735359 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:06.735369 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:06.735382 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:06.804427 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:06.804447 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.337315 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:09.347711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:09.347770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:09.373796 1201669 cri.go:89] found id: ""
	I1218 00:47:09.373809 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.373817 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:09.373823 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:09.373887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:09.398745 1201669 cri.go:89] found id: ""
	I1218 00:47:09.398759 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.398766 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:09.398783 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:09.398850 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:09.424602 1201669 cri.go:89] found id: ""
	I1218 00:47:09.424616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.424623 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:09.424630 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:09.424687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:09.453853 1201669 cri.go:89] found id: ""
	I1218 00:47:09.453866 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.453873 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:09.453879 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:09.453934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:09.482334 1201669 cri.go:89] found id: ""
	I1218 00:47:09.482348 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.482355 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:09.482360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:09.482415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:09.514905 1201669 cri.go:89] found id: ""
	I1218 00:47:09.514928 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.514935 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:09.514941 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:09.515006 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:09.538866 1201669 cri.go:89] found id: ""
	I1218 00:47:09.538888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.538895 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:09.538903 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:09.538913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:09.553496 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:09.553516 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:09.615452 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:09.615461 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:09.615472 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:09.683616 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:09.683638 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.715893 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:09.715908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.282722 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:12.292327 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:12.292388 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:12.317025 1201669 cri.go:89] found id: ""
	I1218 00:47:12.317039 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.317045 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:12.317050 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:12.317106 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:12.341477 1201669 cri.go:89] found id: ""
	I1218 00:47:12.341490 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.341497 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:12.341501 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:12.341556 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:12.365784 1201669 cri.go:89] found id: ""
	I1218 00:47:12.365798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.365805 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:12.365810 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:12.365870 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:12.394874 1201669 cri.go:89] found id: ""
	I1218 00:47:12.394887 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.394894 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:12.394899 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:12.394958 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:12.419496 1201669 cri.go:89] found id: ""
	I1218 00:47:12.419509 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.419516 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:12.419521 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:12.419577 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:12.444379 1201669 cri.go:89] found id: ""
	I1218 00:47:12.444393 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.444399 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:12.444414 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:12.444470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:12.468918 1201669 cri.go:89] found id: ""
	I1218 00:47:12.468931 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.468939 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:12.468946 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:12.468960 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:12.537486 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:12.537505 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:12.568974 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:12.568990 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.635070 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:12.635089 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:12.652372 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:12.652388 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:12.728630 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:15.228895 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:15.239250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:15.239307 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:15.264983 1201669 cri.go:89] found id: ""
	I1218 00:47:15.264996 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.265003 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:15.265009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:15.265070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:15.293517 1201669 cri.go:89] found id: ""
	I1218 00:47:15.293531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.293537 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:15.293542 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:15.293599 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:15.319218 1201669 cri.go:89] found id: ""
	I1218 00:47:15.319231 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.319238 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:15.319243 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:15.319298 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:15.344396 1201669 cri.go:89] found id: ""
	I1218 00:47:15.344410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.344417 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:15.344422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:15.344481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:15.373243 1201669 cri.go:89] found id: ""
	I1218 00:47:15.373256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.373263 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:15.373268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:15.373329 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:15.397807 1201669 cri.go:89] found id: ""
	I1218 00:47:15.397820 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.397827 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:15.397832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:15.397887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:15.422535 1201669 cri.go:89] found id: ""
	I1218 00:47:15.422549 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.422557 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:15.422564 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:15.422574 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:15.490575 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:15.490595 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:15.521157 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:15.521176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:15.592728 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:15.592747 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:15.607949 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:15.607965 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:15.688565 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.190283 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:18.200009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:18.200073 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:18.224427 1201669 cri.go:89] found id: ""
	I1218 00:47:18.224440 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.224447 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:18.224453 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:18.224514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:18.248627 1201669 cri.go:89] found id: ""
	I1218 00:47:18.248641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.248648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:18.248653 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:18.248711 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:18.275672 1201669 cri.go:89] found id: ""
	I1218 00:47:18.275690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.275703 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:18.275709 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:18.275766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:18.302626 1201669 cri.go:89] found id: ""
	I1218 00:47:18.302640 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.302656 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:18.302661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:18.302716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:18.328772 1201669 cri.go:89] found id: ""
	I1218 00:47:18.328785 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.328792 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:18.328797 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:18.328852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:18.354242 1201669 cri.go:89] found id: ""
	I1218 00:47:18.354256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.354263 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:18.354268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:18.354332 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:18.378135 1201669 cri.go:89] found id: ""
	I1218 00:47:18.378148 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.378157 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:18.378165 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:18.378175 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:18.443885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:18.443904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:18.458116 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:18.458135 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:18.520486 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.520496 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:18.520507 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:18.586967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:18.586986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:21.118235 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:21.128015 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:21.128072 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:21.153715 1201669 cri.go:89] found id: ""
	I1218 00:47:21.153729 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.153736 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:21.153742 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:21.153803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:21.183062 1201669 cri.go:89] found id: ""
	I1218 00:47:21.183075 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.183082 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:21.183087 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:21.183144 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:21.210382 1201669 cri.go:89] found id: ""
	I1218 00:47:21.210396 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.210402 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:21.210407 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:21.210462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:21.235561 1201669 cri.go:89] found id: ""
	I1218 00:47:21.235575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.235582 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:21.235587 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:21.235684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:21.261486 1201669 cri.go:89] found id: ""
	I1218 00:47:21.261500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.261507 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:21.261512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:21.261571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:21.286687 1201669 cri.go:89] found id: ""
	I1218 00:47:21.286701 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.286708 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:21.286713 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:21.286770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:21.312639 1201669 cri.go:89] found id: ""
	I1218 00:47:21.312656 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.312663 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:21.312671 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:21.312682 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:21.377475 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:21.377494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:21.394148 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:21.394166 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:21.461525 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:21.461535 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:21.461546 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:21.529823 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:21.529841 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.060601 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:24.071009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:24.071080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:24.098379 1201669 cri.go:89] found id: ""
	I1218 00:47:24.098392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.098399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:24.098406 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:24.098520 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:24.125402 1201669 cri.go:89] found id: ""
	I1218 00:47:24.125416 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.125423 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:24.125428 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:24.125487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:24.151397 1201669 cri.go:89] found id: ""
	I1218 00:47:24.151410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.151417 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:24.151422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:24.151485 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:24.178459 1201669 cri.go:89] found id: ""
	I1218 00:47:24.178473 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.178480 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:24.178485 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:24.178542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:24.204162 1201669 cri.go:89] found id: ""
	I1218 00:47:24.204175 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.204182 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:24.204188 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:24.204282 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:24.232955 1201669 cri.go:89] found id: ""
	I1218 00:47:24.232969 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.232977 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:24.232982 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:24.233043 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:24.258828 1201669 cri.go:89] found id: ""
	I1218 00:47:24.258841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.258848 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:24.258856 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:24.258867 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.285593 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:24.285609 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:24.352328 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:24.352348 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:24.367078 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:24.367095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:24.430867 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:24.430877 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:24.430887 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.002647 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:27.013860 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:27.013930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:27.042334 1201669 cri.go:89] found id: ""
	I1218 00:47:27.042347 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.042354 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:27.042360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:27.042419 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:27.066697 1201669 cri.go:89] found id: ""
	I1218 00:47:27.066710 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.066717 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:27.066722 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:27.066777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:27.094998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.095011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.095018 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:27.095024 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:27.095081 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:27.122504 1201669 cri.go:89] found id: ""
	I1218 00:47:27.122518 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.122525 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:27.122530 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:27.122587 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:27.147998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.148011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.148018 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:27.148023 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:27.148093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:27.172132 1201669 cri.go:89] found id: ""
	I1218 00:47:27.172149 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.172156 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:27.172161 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:27.172253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:27.197418 1201669 cri.go:89] found id: ""
	I1218 00:47:27.197431 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.197438 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:27.197445 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:27.197455 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:27.263570 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:27.263588 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:27.278312 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:27.278327 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:27.342448 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:27.342458 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:27.342469 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.410881 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:27.410901 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:29.944358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:29.954644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:29.954701 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:29.978633 1201669 cri.go:89] found id: ""
	I1218 00:47:29.978647 1201669 logs.go:282] 0 containers: []
	W1218 00:47:29.978654 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:29.978659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:29.978717 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:30.009832 1201669 cri.go:89] found id: ""
	I1218 00:47:30.009850 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.009858 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:30.009864 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:30.009938 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:30.040840 1201669 cri.go:89] found id: ""
	I1218 00:47:30.040858 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.040867 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:30.040876 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:30.040952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:30.068318 1201669 cri.go:89] found id: ""
	I1218 00:47:30.068332 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.068339 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:30.068344 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:30.068407 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:30.094562 1201669 cri.go:89] found id: ""
	I1218 00:47:30.094577 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.094584 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:30.094589 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:30.094650 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:30.121388 1201669 cri.go:89] found id: ""
	I1218 00:47:30.121402 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.121409 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:30.121415 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:30.121472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:30.149519 1201669 cri.go:89] found id: ""
	I1218 00:47:30.149533 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.149540 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:30.149550 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:30.149565 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:30.177089 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:30.177107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:30.242748 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:30.242767 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:30.257468 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:30.257483 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:30.320728 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:30.320738 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:30.320749 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:32.889870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:32.900811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:32.900868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:32.927540 1201669 cri.go:89] found id: ""
	I1218 00:47:32.927553 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.927560 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:32.927565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:32.927622 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:32.955598 1201669 cri.go:89] found id: ""
	I1218 00:47:32.955611 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.955619 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:32.955623 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:32.955695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:32.979141 1201669 cri.go:89] found id: ""
	I1218 00:47:32.979155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.979162 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:32.979167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:32.979224 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:33.006203 1201669 cri.go:89] found id: ""
	I1218 00:47:33.006218 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.006225 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:33.006230 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:33.006294 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:33.034661 1201669 cri.go:89] found id: ""
	I1218 00:47:33.034675 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.034691 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:33.034697 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:33.034756 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:33.062772 1201669 cri.go:89] found id: ""
	I1218 00:47:33.062786 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.062793 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:33.062804 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:33.062869 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:33.086825 1201669 cri.go:89] found id: ""
	I1218 00:47:33.086839 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.086846 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:33.086871 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:33.086881 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:33.156565 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:33.156585 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:33.185756 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:33.185772 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:33.256648 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:33.256666 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:33.271243 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:33.271259 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:33.337446 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
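The cycle above (a `pgrep`/`crictl` sweep, then log gathering, repeating roughly every three seconds) is a poll-until-healthy loop waiting for kube-apiserver to come back. A minimal sketch of that retry pattern, with illustrative names only (`wait_for_apiserver` and `probe` are not minikube's actual API):

```python
import time

def wait_for_apiserver(probe, timeout=30.0, interval=3.0):
    """Poll `probe` until it returns True or `timeout` elapses.

    Mirrors the cadence visible in the log: each failed sweep is
    followed by a short sleep before the next pgrep/crictl pass.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False

# Example: a probe that starts succeeding on its third call,
# standing in for "the apiserver process finally appeared".
calls = {"n": 0}
def flaky_probe():
    calls["n"] += 1
    return calls["n"] >= 3

print(wait_for_apiserver(flaky_probe, timeout=1.0, interval=0.05))  # True
```

If the probe never succeeds before the deadline, the loop gives up and returns False, which corresponds to the eventual test timeout seen in these runs.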
	I1218 00:47:35.839102 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:35.850275 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:35.850343 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:35.875276 1201669 cri.go:89] found id: ""
	I1218 00:47:35.875289 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.875296 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:35.875301 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:35.875361 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:35.912387 1201669 cri.go:89] found id: ""
	I1218 00:47:35.912400 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.912407 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:35.912412 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:35.912471 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:35.942361 1201669 cri.go:89] found id: ""
	I1218 00:47:35.942379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.942394 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:35.942400 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:35.942499 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:35.972562 1201669 cri.go:89] found id: ""
	I1218 00:47:35.972575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.972584 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:35.972588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:35.972644 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:35.998846 1201669 cri.go:89] found id: ""
	I1218 00:47:35.998861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.998868 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:35.998874 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:35.998952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:36.030184 1201669 cri.go:89] found id: ""
	I1218 00:47:36.030197 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.030213 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:36.030219 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:36.030292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:36.055609 1201669 cri.go:89] found id: ""
	I1218 00:47:36.055624 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.055640 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:36.055648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:36.055658 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:36.128355 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:36.128374 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:36.159887 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:36.159904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:36.229693 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:36.229712 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:36.244397 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:36.244412 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:36.308352 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:38.808637 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:38.819085 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:38.819152 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:38.844745 1201669 cri.go:89] found id: ""
	I1218 00:47:38.844758 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.844766 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:38.844771 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:38.844827 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:38.869442 1201669 cri.go:89] found id: ""
	I1218 00:47:38.869456 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.869463 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:38.869469 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:38.869531 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:38.899129 1201669 cri.go:89] found id: ""
	I1218 00:47:38.899151 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.899158 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:38.899163 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:38.899232 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:38.929158 1201669 cri.go:89] found id: ""
	I1218 00:47:38.929171 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.929178 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:38.929184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:38.929250 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:38.959987 1201669 cri.go:89] found id: ""
	I1218 00:47:38.960016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.960023 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:38.960029 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:38.960093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:38.987077 1201669 cri.go:89] found id: ""
	I1218 00:47:38.987091 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.987098 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:38.987104 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:38.987160 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:39.015227 1201669 cri.go:89] found id: ""
	I1218 00:47:39.015240 1201669 logs.go:282] 0 containers: []
	W1218 00:47:39.015257 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:39.015266 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:39.015278 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:39.044299 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:39.044322 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:39.110657 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:39.110677 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:39.127155 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:39.127171 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:39.195223 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:39.195233 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:39.195243 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:41.762478 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:41.772539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:41.772600 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:41.797940 1201669 cri.go:89] found id: ""
	I1218 00:47:41.797954 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.797961 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:41.797967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:41.798024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:41.823230 1201669 cri.go:89] found id: ""
	I1218 00:47:41.823244 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.823251 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:41.823256 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:41.823314 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:41.848348 1201669 cri.go:89] found id: ""
	I1218 00:47:41.848368 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.848384 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:41.848390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:41.848447 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:41.873186 1201669 cri.go:89] found id: ""
	I1218 00:47:41.873199 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.873207 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:41.873212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:41.873269 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:41.910240 1201669 cri.go:89] found id: ""
	I1218 00:47:41.910253 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.910260 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:41.910265 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:41.910323 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:41.938636 1201669 cri.go:89] found id: ""
	I1218 00:47:41.938649 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.938656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:41.938661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:41.938723 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:41.967011 1201669 cri.go:89] found id: ""
	I1218 00:47:41.967024 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.967031 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:41.967039 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:41.967048 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:42.032273 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:42.032293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:42.047961 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:42.047977 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:42.129763 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:42.129777 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:42.129788 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:42.203638 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:42.203661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:44.747018 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:44.757561 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:44.757666 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:44.782848 1201669 cri.go:89] found id: ""
	I1218 00:47:44.782861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.782868 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:44.782873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:44.782930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:44.812029 1201669 cri.go:89] found id: ""
	I1218 00:47:44.812042 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.812049 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:44.812054 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:44.812111 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:44.835973 1201669 cri.go:89] found id: ""
	I1218 00:47:44.835986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.835994 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:44.835998 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:44.836055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:44.865506 1201669 cri.go:89] found id: ""
	I1218 00:47:44.865524 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.865532 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:44.865539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:44.865596 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:44.895590 1201669 cri.go:89] found id: ""
	I1218 00:47:44.895603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.895610 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:44.895615 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:44.895678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:44.930517 1201669 cri.go:89] found id: ""
	I1218 00:47:44.930531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.930538 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:44.930544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:44.930602 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:44.963147 1201669 cri.go:89] found id: ""
	I1218 00:47:44.963161 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.963168 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:44.963176 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:44.963187 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:45.068693 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:45.068706 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:45.068718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:45.150525 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:45.150547 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:45.198775 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:45.198795 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:45.282633 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:45.282655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:47.798966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:47.809011 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:47.809070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:47.836141 1201669 cri.go:89] found id: ""
	I1218 00:47:47.836155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.836161 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:47.836167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:47.836256 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:47.862554 1201669 cri.go:89] found id: ""
	I1218 00:47:47.862568 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.862575 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:47.862580 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:47.862645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:47.889972 1201669 cri.go:89] found id: ""
	I1218 00:47:47.889986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.889992 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:47.889997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:47.890054 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:47.922142 1201669 cri.go:89] found id: ""
	I1218 00:47:47.922155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.922162 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:47.922168 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:47.922223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:47.956979 1201669 cri.go:89] found id: ""
	I1218 00:47:47.956993 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.956999 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:47.957005 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:47.957062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:47.982938 1201669 cri.go:89] found id: ""
	I1218 00:47:47.982952 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.982959 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:47.982965 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:47.983027 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:48.014164 1201669 cri.go:89] found id: ""
	I1218 00:47:48.014178 1201669 logs.go:282] 0 containers: []
	W1218 00:47:48.014184 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:48.014192 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:48.014205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:48.078819 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:48.078831 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:48.078850 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:48.151018 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:48.151045 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:48.178919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:48.178937 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:48.246806 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:48.246828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:50.762650 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:50.772894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:50.772953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:50.798440 1201669 cri.go:89] found id: ""
	I1218 00:47:50.798453 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.798459 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:50.798468 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:50.798525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:50.824627 1201669 cri.go:89] found id: ""
	I1218 00:47:50.824641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.824648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:50.824654 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:50.824713 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:50.849720 1201669 cri.go:89] found id: ""
	I1218 00:47:50.849732 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.849740 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:50.849745 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:50.849802 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:50.873828 1201669 cri.go:89] found id: ""
	I1218 00:47:50.873841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.873849 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:50.873854 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:50.873910 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:50.905379 1201669 cri.go:89] found id: ""
	I1218 00:47:50.905392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.905399 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:50.905404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:50.905461 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:50.935677 1201669 cri.go:89] found id: ""
	I1218 00:47:50.935690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.935697 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:50.935702 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:50.935774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:50.970057 1201669 cri.go:89] found id: ""
	I1218 00:47:50.970070 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.970077 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:50.970085 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:50.970095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:51.036789 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:51.036810 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:51.051895 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:51.051913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:51.116641 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:51.116651 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:51.116663 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:51.186315 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:51.186337 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:53.718450 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:53.728262 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:53.728318 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:53.755772 1201669 cri.go:89] found id: ""
	I1218 00:47:53.755787 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.755793 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:53.755798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:53.755855 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:53.780839 1201669 cri.go:89] found id: ""
	I1218 00:47:53.780853 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.780860 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:53.780865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:53.780929 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:53.806552 1201669 cri.go:89] found id: ""
	I1218 00:47:53.806603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.806611 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:53.806616 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:53.806672 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:53.832361 1201669 cri.go:89] found id: ""
	I1218 00:47:53.832380 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.832401 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:53.832420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:53.832492 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:53.859241 1201669 cri.go:89] found id: ""
	I1218 00:47:53.859254 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.859262 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:53.859277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:53.859335 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:53.884714 1201669 cri.go:89] found id: ""
	I1218 00:47:53.884728 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.884735 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:53.884740 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:53.884803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:53.921003 1201669 cri.go:89] found id: ""
	I1218 00:47:53.921016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.921024 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:53.921031 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:53.921041 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:54.003954 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:54.003975 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:54.020878 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:54.020896 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:54.086911 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:54.086921 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:54.086943 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:54.157859 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:54.157878 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:56.687608 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:56.697675 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:56.697732 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:56.722032 1201669 cri.go:89] found id: ""
	I1218 00:47:56.722045 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.722053 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:56.722058 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:56.722113 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:56.746685 1201669 cri.go:89] found id: ""
	I1218 00:47:56.746698 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.746705 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:56.746712 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:56.746769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:56.771487 1201669 cri.go:89] found id: ""
	I1218 00:47:56.771500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.771508 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:56.771515 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:56.771571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:56.795765 1201669 cri.go:89] found id: ""
	I1218 00:47:56.795778 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.795785 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:56.795790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:56.795845 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:56.820457 1201669 cri.go:89] found id: ""
	I1218 00:47:56.820470 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.820477 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:56.820482 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:56.820543 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:56.844750 1201669 cri.go:89] found id: ""
	I1218 00:47:56.844764 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.844788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:56.844794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:56.844859 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:56.870299 1201669 cri.go:89] found id: ""
	I1218 00:47:56.870312 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.870319 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:56.870326 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:56.870336 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:56.957977 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:56.957986 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:56.957996 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:57.026903 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:57.026922 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:57.056057 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:57.056072 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:57.122322 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:57.122341 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.637384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:59.647089 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:59.647147 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:59.675785 1201669 cri.go:89] found id: ""
	I1218 00:47:59.675798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.675805 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:59.675811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:59.675868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:59.700863 1201669 cri.go:89] found id: ""
	I1218 00:47:59.700876 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.700883 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:59.700888 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:59.700951 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:59.726366 1201669 cri.go:89] found id: ""
	I1218 00:47:59.726379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.726388 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:59.726394 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:59.726449 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:59.754806 1201669 cri.go:89] found id: ""
	I1218 00:47:59.754819 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.754826 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:59.754832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:59.754887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:59.779823 1201669 cri.go:89] found id: ""
	I1218 00:47:59.779842 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.779850 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:59.779855 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:59.779931 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:59.809497 1201669 cri.go:89] found id: ""
	I1218 00:47:59.809511 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.809519 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:59.809524 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:59.809580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:59.834274 1201669 cri.go:89] found id: ""
	I1218 00:47:59.834287 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.834294 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:59.834302 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:59.834312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:59.908086 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:59.908107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.923555 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:59.923571 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:59.996659 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:59.996668 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:59.996679 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:48:00.245332 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:48:00.245355 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:48:02.854946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:48:02.865088 1201669 kubeadm.go:602] duration metric: took 4m2.280648529s to restartPrimaryControlPlane
	W1218 00:48:02.865154 1201669 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1218 00:48:02.865291 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:48:03.285302 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:48:03.298386 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:48:03.307630 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:48:03.307686 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:48:03.316384 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:48:03.316392 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:48:03.316448 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:48:03.324266 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:48:03.324330 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:48:03.332001 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:48:03.339756 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:48:03.339811 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:48:03.347895 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.356395 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:48:03.356451 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.364239 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:48:03.373496 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:48:03.373555 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:48:03.380932 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:48:03.422222 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:48:03.422277 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:48:03.498554 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:48:03.498619 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:48:03.498653 1201669 kubeadm.go:319] OS: Linux
	I1218 00:48:03.498697 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:48:03.498750 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:48:03.498797 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:48:03.498844 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:48:03.498890 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:48:03.498939 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:48:03.498983 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:48:03.499030 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:48:03.499077 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:48:03.575694 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:48:03.575807 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:48:03.575895 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:48:03.584731 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:48:03.590040 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:48:03.590125 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:48:03.590198 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:48:03.590273 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:48:03.590332 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:48:03.590401 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:48:03.590455 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:48:03.590517 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:48:03.590577 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:48:03.590649 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:48:03.590726 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:48:03.590762 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:48:03.590820 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:48:03.968959 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:48:04.492311 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:48:04.657077 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:48:05.347391 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:48:06.111689 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:48:06.112246 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:48:06.114858 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:48:06.118151 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:48:06.118267 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:48:06.118369 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:48:06.118440 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:48:06.133862 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:48:06.134164 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:48:06.143224 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:48:06.143316 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:48:06.143354 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:48:06.274772 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:48:06.274905 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:52:06.274474 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000113635s
	I1218 00:52:06.274499 1201669 kubeadm.go:319] 
	I1218 00:52:06.274555 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:52:06.274586 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:52:06.274697 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:52:06.274703 1201669 kubeadm.go:319] 
	I1218 00:52:06.274816 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:52:06.274846 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:52:06.274874 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:52:06.274877 1201669 kubeadm.go:319] 
	I1218 00:52:06.279422 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:52:06.279849 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:52:06.279958 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:52:06.280242 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1218 00:52:06.280248 1201669 kubeadm.go:319] 
	I1218 00:52:06.280323 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1218 00:52:06.280425 1201669 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000113635s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 00:52:06.280513 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:52:06.687216 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:52:06.699735 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:52:06.699788 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:52:06.707587 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:52:06.707598 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:52:06.707647 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:52:06.715175 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:52:06.715229 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:52:06.722487 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:52:06.729668 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:52:06.729722 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:52:06.736814 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.744131 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:52:06.744183 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.751469 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:52:06.758728 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:52:06.758782 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:52:06.765652 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:52:06.801363 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:52:06.801639 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:52:06.871618 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:52:06.871677 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:52:06.871709 1201669 kubeadm.go:319] OS: Linux
	I1218 00:52:06.871750 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:52:06.871795 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:52:06.871839 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:52:06.871883 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:52:06.871926 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:52:06.871970 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:52:06.872012 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:52:06.872056 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:52:06.872097 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:52:06.943596 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:52:06.943710 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:52:06.943809 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:52:06.952719 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:52:06.957986 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:52:06.958071 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:52:06.958134 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:52:06.958209 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:52:06.958270 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:52:06.958342 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:52:06.958395 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:52:06.958469 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:52:06.958529 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:52:06.958603 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:52:06.958674 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:52:06.958710 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:52:06.958765 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:52:07.159266 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:52:07.543682 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:52:07.621245 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:52:07.789755 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:52:08.258810 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:52:08.259464 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:52:08.262206 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:52:08.265520 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:52:08.265615 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:52:08.265696 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:52:08.266218 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:52:08.282138 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:52:08.282258 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:52:08.290066 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:52:08.290407 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:52:08.290607 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:52:08.422232 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:52:08.422344 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:56:08.423339 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001129518s
	I1218 00:56:08.423364 1201669 kubeadm.go:319] 
	I1218 00:56:08.423420 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:56:08.423452 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:56:08.423565 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:56:08.423570 1201669 kubeadm.go:319] 
	I1218 00:56:08.423755 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:56:08.423825 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:56:08.423872 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:56:08.423876 1201669 kubeadm.go:319] 
	I1218 00:56:08.428596 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:56:08.429049 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:56:08.429151 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:56:08.429380 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 00:56:08.429383 1201669 kubeadm.go:319] 
	I1218 00:56:08.429447 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:56:08.429502 1201669 kubeadm.go:403] duration metric: took 12m7.881074518s to StartCluster
	I1218 00:56:08.429533 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:56:08.429592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:56:08.454446 1201669 cri.go:89] found id: ""
	I1218 00:56:08.454459 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.454467 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:56:08.454472 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:56:08.454527 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:56:08.479309 1201669 cri.go:89] found id: ""
	I1218 00:56:08.479323 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.479330 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:56:08.479335 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:56:08.479395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:56:08.506727 1201669 cri.go:89] found id: ""
	I1218 00:56:08.506740 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.506747 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:56:08.506752 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:56:08.506809 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:56:08.531214 1201669 cri.go:89] found id: ""
	I1218 00:56:08.531228 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.531235 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:56:08.531240 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:56:08.531295 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:56:08.555634 1201669 cri.go:89] found id: ""
	I1218 00:56:08.555647 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.555654 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:56:08.555659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:56:08.555716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:56:08.580409 1201669 cri.go:89] found id: ""
	I1218 00:56:08.580423 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.580430 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:56:08.580435 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:56:08.580494 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:56:08.605063 1201669 cri.go:89] found id: ""
	I1218 00:56:08.605089 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.605096 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:56:08.605105 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:56:08.605116 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:56:08.684346 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:56:08.684356 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:56:08.684367 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:56:08.760495 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:56:08.760515 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:56:08.787919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:56:08.787936 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:56:08.853642 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:56:08.853661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	W1218 00:56:08.868901 1201669 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 00:56:08.868939 1201669 out.go:285] * 
	W1218 00:56:08.868999 1201669 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.869015 1201669 out.go:285] * 
	W1218 00:56:08.871456 1201669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:56:08.877860 1201669 out.go:203] 
	W1218 00:56:08.880779 1201669 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.880832 1201669 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 00:56:08.880854 1201669 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 00:56:08.883989 1201669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113118431Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113153129Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113189559Z" level=info msg="Create NRI interface"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113282086Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113290964Z" level=info msg="runtime interface created"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113301647Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113309343Z" level=info msg="runtime interface starting up..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113315505Z" level=info msg="starting plugins..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113327796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.11339067Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:43:59 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.578897723Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=a394bef7-706e-4c2b-a83c-e7a192425f8f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.579569606Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=0b73d3f0-8cf4-4881-9be6-303c65310a78 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58003914Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=ba617c6c-560d-48a4-8069-49b5cad617df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58069138Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=1b435c90-bcae-4d5e-85b5-8f24b84aad77 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581151364Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=1ca4dc15-0b08-49d0-89ca-728ba68fd7be name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581562446Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=9758cff4-6113-4178-8c9f-4ef34a0e91ee name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581979017Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=34a384cc-3abb-4525-b194-0557e1231baf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:10.115202   21244 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:10.115926   21244 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:10.117743   21244 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:10.118578   21244 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:10.119614   21244 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:56:10 up  7:38,  0 user,  load average: 0.03, 0.16, 0.38
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:56:07 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:08 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2128.
	Dec 18 00:56:08 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:08 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:08 functional-288604 kubelet[21119]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:08 functional-288604 kubelet[21119]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:08 functional-288604 kubelet[21119]: E1218 00:56:08.706789   21119 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:08 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:08 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:09 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2129.
	Dec 18 00:56:09 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:09 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:09 functional-288604 kubelet[21159]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:09 functional-288604 kubelet[21159]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:09 functional-288604 kubelet[21159]: E1218 00:56:09.470823   21159 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:09 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:09 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2130.
	Dec 18 00:56:10 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: E1218 00:56:10.188585   21249 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (358.600703ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (735.07s)
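The kubelet crash loop in the logs above is a configuration validation failure, not a flake: the kubelet error states it refuses to run on a cgroup v1 host. As a diagnostic sketch (an assumption about how to inspect the runner, not output captured from this run), the host's cgroup version can be checked with:

```shell
# Print the filesystem type mounted at /sys/fs/cgroup.
# "cgroup2fs" means the host is on cgroup v2 (unified hierarchy);
# "tmpfs" typically indicates a cgroup v1 layout, which matches the
# "kubelet is configured to not run on a host using cgroup v1" error above.
stat -fc %T /sys/fs/cgroup
```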

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-288604 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-288604 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (57.242445ms)

-- stdout --
	{
	    "apiVersion": "v1",
	    "items": [],
	    "kind": "List",
	    "metadata": {
	        "resourceVersion": ""
	    }
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-288604 get po -l tier=control-plane -n kube-system -o=json": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (312.66ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                      ARGS                                                                       │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-240845 ssh pgrep buildkitd                                                                                                           │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ image   │ functional-240845 image ls --format yaml --alsologtostderr                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr                                          │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format table --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls --format short --alsologtostderr                                                                                     │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ image   │ functional-240845 image ls                                                                                                                      │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ delete  │ -p functional-240845                                                                                                                            │ functional-240845 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │ 18 Dec 25 00:29 UTC │
	│ start   │ -p functional-288604 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:29 UTC │                     │
	│ start   │ -p functional-288604 --alsologtostderr -v=8                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:37 UTC │                     │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.1                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:3.3                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add registry.k8s.io/pause:latest                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache add minikube-local-cache-test:functional-288604                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ functional-288604 cache delete minikube-local-cache-test:functional-288604                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ list                                                                                                                                            │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl images                                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ cache   │ functional-288604 cache reload                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ kubectl │ functional-288604 kubectl -- --context functional-288604 get pods                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ start   │ -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:43:55
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:43:55.978742 1201669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:43:55.978849 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.978853 1201669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:43:55.978857 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.979124 1201669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:43:55.979466 1201669 out.go:368] Setting JSON to false
	I1218 00:43:55.980315 1201669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26784,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:43:55.980372 1201669 start.go:143] virtualization:  
	I1218 00:43:55.983789 1201669 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:43:55.987542 1201669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:43:55.987604 1201669 notify.go:221] Checking for updates...
	I1218 00:43:55.993164 1201669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:43:55.995954 1201669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:43:55.999614 1201669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:43:56.002831 1201669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:43:56.005802 1201669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:43:56.009212 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:56.009315 1201669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:43:56.041210 1201669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:43:56.041338 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.105588 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.095254501 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.105683 1201669 docker.go:319] overlay module found
	I1218 00:43:56.108792 1201669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:43:56.111628 1201669 start.go:309] selected driver: docker
	I1218 00:43:56.111638 1201669 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.111765 1201669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:43:56.111873 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.170180 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.160520969 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.170597 1201669 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:43:56.170621 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:56.170672 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:56.170715 1201669 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.173990 1201669 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:43:56.177055 1201669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:43:56.179992 1201669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:43:56.182847 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:56.182889 1201669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:43:56.182897 1201669 cache.go:65] Caching tarball of preloaded images
	I1218 00:43:56.182969 1201669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:43:56.182979 1201669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:43:56.182988 1201669 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:43:56.183103 1201669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:43:56.202673 1201669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:43:56.202684 1201669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:43:56.202702 1201669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:43:56.202743 1201669 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:43:56.202797 1201669 start.go:364] duration metric: took 37.488µs to acquireMachinesLock for "functional-288604"
	I1218 00:43:56.202818 1201669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:43:56.202823 1201669 fix.go:54] fixHost starting: 
	I1218 00:43:56.203129 1201669 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:43:56.220546 1201669 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:43:56.220565 1201669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:43:56.223742 1201669 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:43:56.223770 1201669 machine.go:94] provisionDockerMachine start ...
	I1218 00:43:56.223861 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.243517 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.243858 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.243865 1201669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:43:56.399607 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.399622 1201669 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:43:56.399683 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.417287 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.417598 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.417605 1201669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:43:56.583098 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.583184 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.603369 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.603669 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.603683 1201669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
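	The /etc/hosts fix-up the provisioner runs above can be rehearsed without touching the node: the sketch below applies the same rewrite-or-append logic to a throwaway hosts file (the temp path and the seed contents are stand-ins; the real command edits /etc/hosts over SSH with sudo).

```shell
# Rehearse minikube's 127.0.1.1 hostname rewrite against a throwaway hosts file.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$hosts"

name=functional-288604
if ! grep -q "[[:space:]]$name" "$hosts"; then
	if grep -q '^127\.0\.1\.1[[:space:]]' "$hosts"; then
		# An existing 127.0.1.1 entry is rewritten in place...
		sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 $name/" "$hosts"
	else
		# ...otherwise a fresh entry is appended.
		echo "127.0.1.1 $name" >> "$hosts"
	fi
fi
entry=$(grep '^127\.0\.1\.1' "$hosts")
echo "$entry"
rm -f "$hosts"
```

	Run twice, the script is a no-op the second time, which is why the provisioner can safely re-run it on an already-provisioned container.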
	I1218 00:43:56.772929 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:43:56.772944 1201669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:43:56.772975 1201669 ubuntu.go:190] setting up certificates
	I1218 00:43:56.772989 1201669 provision.go:84] configureAuth start
	I1218 00:43:56.773070 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:56.789980 1201669 provision.go:143] copyHostCerts
	I1218 00:43:56.790044 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:43:56.790056 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:43:56.790131 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:43:56.790231 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:43:56.790235 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:43:56.790260 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:43:56.790310 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:43:56.790313 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:43:56.790335 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:43:56.790376 1201669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:43:56.986120 1201669 provision.go:177] copyRemoteCerts
	I1218 00:43:56.986182 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:43:56.986224 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.010906 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.115839 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:43:57.132835 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:43:57.150663 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:43:57.167535 1201669 provision.go:87] duration metric: took 394.523589ms to configureAuth
	I1218 00:43:57.167552 1201669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:43:57.167745 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:57.167846 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.184649 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:57.184955 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:57.184966 1201669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:43:57.547661 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:43:57.547677 1201669 machine.go:97] duration metric: took 1.323900056s to provisionDockerMachine
	I1218 00:43:57.547689 1201669 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:43:57.547701 1201669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:43:57.547767 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:43:57.547816 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.568532 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.675839 1201669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:43:57.679095 1201669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:43:57.679112 1201669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:43:57.679121 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:43:57.679176 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:43:57.679251 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:43:57.679324 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:43:57.679367 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:43:57.686719 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:57.703522 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:43:57.720871 1201669 start.go:296] duration metric: took 173.166293ms for postStartSetup
	I1218 00:43:57.720943 1201669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:43:57.720983 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.737854 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.841489 1201669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
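	The two df probes above are how minikube samples /var usage percent and free gigabytes after postStartSetup; the same awk extraction can be tried on any Linux host (probing / here instead of the container's /var):

```shell
# Column 5 of df -h's second line is Use%; column 4 of df -BG's is free space.
used_pct=$(df -h / | awk 'NR==2{print $5}')
free_g=$(df -BG / | awk 'NR==2{print $4}')
echo "used=$used_pct free=$free_g"
```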
	I1218 00:43:57.846519 1201669 fix.go:56] duration metric: took 1.643688341s for fixHost
	I1218 00:43:57.846534 1201669 start.go:83] releasing machines lock for "functional-288604", held for 1.6437309s
	I1218 00:43:57.846614 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:57.862813 1201669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:43:57.862836 1201669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:43:57.862859 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.862906 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.880942 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.881296 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.984097 1201669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:43:58.077458 1201669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:43:58.117786 1201669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:43:58.128203 1201669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:43:58.128283 1201669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:43:58.137853 1201669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:43:58.137867 1201669 start.go:496] detecting cgroup driver to use...
	I1218 00:43:58.137898 1201669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:43:58.137955 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:43:58.154333 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:43:58.171243 1201669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:43:58.171317 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:43:58.187629 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:43:58.200443 1201669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:43:58.332309 1201669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:43:58.456320 1201669 docker.go:234] disabling docker service ...
	I1218 00:43:58.456386 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:43:58.471261 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:43:58.484090 1201669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:43:58.600872 1201669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:43:58.712059 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:43:58.725312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:43:58.738398 1201669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:43:58.738467 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.746850 1201669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:43:58.746917 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.755273 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.763400 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.771727 1201669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:43:58.779324 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.788210 1201669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.796348 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
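	The run of sed commands above edits /etc/crio/crio.conf.d/02-crio.conf in place: pin the pause image, force the cgroupfs cgroup manager, and re-add conmon_cgroup after the manager line. Their combined effect can be rehearsed on a sample fragment (the starting values below are stand-ins, not the container's real 02-crio.conf):

```shell
# Rehearse the pause-image / cgroup-driver rewrites on a sample crio conf.
conf=$(mktemp)
cat > "$conf" <<'EOF'
pause_image = "registry.k8s.io/pause:3.9"
cgroup_manager = "systemd"
conmon_cgroup = "system.slice"
EOF

sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' "$conf"
sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' "$conf"
# conmon_cgroup is deleted, then re-appended right after the manager line.
sed -i '/conmon_cgroup = .*/d' "$conf"
sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' "$conf"

pause=$(awk -F'"' '/pause_image/{print $2}' "$conf")
driver=$(awk -F'"' '/cgroup_manager/{print $2}' "$conf")
conmon=$(awk -F'"' '/conmon_cgroup/{print $2}' "$conf")
echo "$pause $driver $conmon"
rm -f "$conf"
```

	The delete-then-append dance keeps the file idempotent: re-running the sequence always leaves exactly one conmon_cgroup line in the expected spot.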
	I1218 00:43:58.804389 1201669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:43:58.811403 1201669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:43:58.818408 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:58.951912 1201669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:43:59.118783 1201669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:43:59.118849 1201669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:43:59.122545 1201669 start.go:564] Will wait 60s for crictl version
	I1218 00:43:59.122604 1201669 ssh_runner.go:195] Run: which crictl
	I1218 00:43:59.126019 1201669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:43:59.148982 1201669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:43:59.149067 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.175940 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.206912 1201669 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:43:59.209698 1201669 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:43:59.225649 1201669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:43:59.232549 1201669 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1218 00:43:59.235431 1201669 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:43:59.235543 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:59.235614 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.273407 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.273418 1201669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:43:59.273471 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.299275 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.299287 1201669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:43:59.299293 1201669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:43:59.299404 1201669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:43:59.299490 1201669 ssh_runner.go:195] Run: crio config
	I1218 00:43:59.362084 1201669 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1218 00:43:59.362106 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:59.362113 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:59.362126 1201669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:43:59.362149 1201669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:43:59.362277 1201669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:43:59.362352 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:43:59.369805 1201669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:43:59.369864 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:43:59.376968 1201669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:43:59.388765 1201669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:43:59.400454 1201669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2069 bytes)
	I1218 00:43:59.412514 1201669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:43:59.416040 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:59.531606 1201669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:43:59.640794 1201669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:43:59.640805 1201669 certs.go:195] generating shared ca certs ...
	I1218 00:43:59.640830 1201669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:43:59.640959 1201669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:43:59.641001 1201669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:43:59.641007 1201669 certs.go:257] generating profile certs ...
	I1218 00:43:59.641121 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:43:59.641164 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:43:59.641201 1201669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:43:59.641309 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:43:59.641337 1201669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:43:59.641343 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:43:59.641373 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:43:59.641395 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:43:59.641423 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:43:59.641463 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:59.642073 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:43:59.660992 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:43:59.679818 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:43:59.699150 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:43:59.718895 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:43:59.738413 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:43:59.756315 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:43:59.773826 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:43:59.791059 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:43:59.807447 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:43:59.824212 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:43:59.841186 1201669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:43:59.853492 1201669 ssh_runner.go:195] Run: openssl version
	I1218 00:43:59.859998 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.866869 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:43:59.873885 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877278 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877331 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.917714 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:43:59.925047 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.932048 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:43:59.939101 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942813 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942866 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.983421 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:43:59.990593 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:43:59.997725 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:44:00.042943 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059312 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059393 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.179416 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:44:00.199517 1201669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:44:00.211411 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:44:00.299862 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:44:00.347783 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:44:00.400161 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:44:00.445236 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:44:00.505288 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:44:00.548440 1201669 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOp
timizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:44:00.548538 1201669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:44:00.548659 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.576531 1201669 cri.go:89] found id: ""
	I1218 00:44:00.576602 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:44:00.584414 1201669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:44:00.584430 1201669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:44:00.584481 1201669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:44:00.591678 1201669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.592197 1201669 kubeconfig.go:125] found "functional-288604" server: "https://192.168.49.2:8441"
	I1218 00:44:00.593407 1201669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:44:00.601066 1201669 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-18 00:29:23.211763247 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-18 00:43:59.405160305 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1218 00:44:00.601075 1201669 kubeadm.go:1161] stopping kube-system containers ...
	I1218 00:44:00.601085 1201669 cri.go:54] listing CRI containers in root : {State:all Name: Namespaces:[kube-system]}
	I1218 00:44:00.601140 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.626991 1201669 cri.go:89] found id: ""
	I1218 00:44:00.627065 1201669 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 00:44:00.640495 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:44:00.648256 1201669 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 18 00:33 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec 18 00:33 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 18 00:33 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 18 00:33 /etc/kubernetes/scheduler.conf
	
	I1218 00:44:00.648311 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:44:00.655772 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:44:00.663347 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.663410 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:44:00.670748 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.677977 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.678031 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.685079 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:44:00.692996 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.693049 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:44:00.700106 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:44:00.707647 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:00.751682 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:01.971643 1201669 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.219916809s)
	I1218 00:44:01.971736 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.213563 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.279593 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.331094 1201669 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:44:02.331177 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:02.831338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.332205 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.831381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.331525 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.832325 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.331379 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.332243 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.831869 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.331354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.831326 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.331942 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.831354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.331370 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.832255 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.331366 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.831363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.831359 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.331357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.331990 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.831891 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.331340 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.832123 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.331341 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.332060 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.831352 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.331755 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.831466 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.331860 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.831293 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.831369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.331585 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.832126 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.331328 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.831986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.331369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.831627 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.331975 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.831268 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.331992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.831394 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.331896 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.831502 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.331383 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.831706 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.332082 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.831353 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.331380 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.832133 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.331347 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.831351 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.332001 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.831800 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.331774 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.831372 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.332276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.832017 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.331329 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.832065 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.331713 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.831374 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.331600 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.332164 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.831455 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.331933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.831358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.332063 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.831460 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.331554 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.832152 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.331280 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.831272 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.332273 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.832020 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.331662 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.831758 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.331412 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.831371 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.332088 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.831480 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.332490 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.832201 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.331816 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.831276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.331408 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.831739 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.331262 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.831814 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.332083 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.832108 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.331984 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.831507 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.331363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.831505 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.332120 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.831384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.332279 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.831590 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.331361 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.831933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.331338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.332254 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.832148 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.331950 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.831349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.332302 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.832264 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.331912 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.832145 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.331498 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.831848 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.331497 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:02.332289 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:02.332395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:02.358402 1201669 cri.go:89] found id: ""
	I1218 00:45:02.358416 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.358424 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:02.358429 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:02.358493 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:02.386799 1201669 cri.go:89] found id: ""
	I1218 00:45:02.386814 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.386821 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:02.386825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:02.386882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:02.419430 1201669 cri.go:89] found id: ""
	I1218 00:45:02.419445 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.419453 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:02.419460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:02.419560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:02.445313 1201669 cri.go:89] found id: ""
	I1218 00:45:02.445326 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.445333 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:02.445338 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:02.445395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:02.474189 1201669 cri.go:89] found id: ""
	I1218 00:45:02.474203 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.474210 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:02.474215 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:02.474278 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:02.501782 1201669 cri.go:89] found id: ""
	I1218 00:45:02.501796 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.501803 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:02.501808 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:02.501867 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:02.531648 1201669 cri.go:89] found id: ""
	I1218 00:45:02.531662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.531669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:02.531677 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:02.531690 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:02.597077 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:02.597095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:02.612827 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:02.612845 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:02.680833 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:02.680844 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:02.680855 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:02.749861 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:02.749884 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:05.287966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:05.298109 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:05.298171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:05.323714 1201669 cri.go:89] found id: ""
	I1218 00:45:05.323727 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.323733 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:05.323739 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:05.323800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:05.348520 1201669 cri.go:89] found id: ""
	I1218 00:45:05.348534 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.348541 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:05.348546 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:05.348604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:05.373275 1201669 cri.go:89] found id: ""
	I1218 00:45:05.373290 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.373297 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:05.373302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:05.373362 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:05.397833 1201669 cri.go:89] found id: ""
	I1218 00:45:05.397846 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.397853 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:05.397859 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:05.397921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:05.422938 1201669 cri.go:89] found id: ""
	I1218 00:45:05.422952 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.422959 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:05.422964 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:05.423026 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:05.451027 1201669 cri.go:89] found id: ""
	I1218 00:45:05.451041 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.451048 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:05.451053 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:05.451115 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:05.477082 1201669 cri.go:89] found id: ""
	I1218 00:45:05.477096 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.477102 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:05.477110 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:05.477120 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:05.543065 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:05.543083 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:05.558032 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:05.558047 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:05.623058 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:05.623071 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:05.623081 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:05.694967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:05.694987 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.224381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:08.234565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:08.234639 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:08.262640 1201669 cri.go:89] found id: ""
	I1218 00:45:08.262654 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.262661 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:08.262667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:08.262724 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:08.288384 1201669 cri.go:89] found id: ""
	I1218 00:45:08.288397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.288404 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:08.288409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:08.288468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:08.314880 1201669 cri.go:89] found id: ""
	I1218 00:45:08.314893 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.314900 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:08.314911 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:08.314971 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:08.340105 1201669 cri.go:89] found id: ""
	I1218 00:45:08.340119 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.340125 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:08.340131 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:08.340202 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:08.370009 1201669 cri.go:89] found id: ""
	I1218 00:45:08.370023 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.370030 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:08.370035 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:08.370094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:08.394925 1201669 cri.go:89] found id: ""
	I1218 00:45:08.394939 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.394946 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:08.394951 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:08.395013 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:08.419448 1201669 cri.go:89] found id: ""
	I1218 00:45:08.419462 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.419469 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:08.419477 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:08.419487 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:08.493271 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:08.493290 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.521236 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:08.521251 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:08.591011 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:08.591030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:08.605700 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:08.605716 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:08.674615 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.175336 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:11.186731 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:11.186790 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:11.217495 1201669 cri.go:89] found id: ""
	I1218 00:45:11.217510 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.217517 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:11.217522 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:11.217579 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:11.242494 1201669 cri.go:89] found id: ""
	I1218 00:45:11.242506 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.242514 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:11.242519 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:11.242588 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:11.269562 1201669 cri.go:89] found id: ""
	I1218 00:45:11.269576 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.269583 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:11.269588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:11.269646 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:11.296483 1201669 cri.go:89] found id: ""
	I1218 00:45:11.296497 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.296503 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:11.296517 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:11.296573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:11.324023 1201669 cri.go:89] found id: ""
	I1218 00:45:11.324037 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.324044 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:11.324049 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:11.324107 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:11.350813 1201669 cri.go:89] found id: ""
	I1218 00:45:11.350826 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.350833 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:11.350838 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:11.350915 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:11.375508 1201669 cri.go:89] found id: ""
	I1218 00:45:11.375522 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.375529 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:11.375538 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:11.375548 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:11.443170 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:11.443196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:11.458193 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:11.458209 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:11.526119 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.526129 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:11.526139 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:11.598390 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:11.598409 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:14.127470 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:14.140176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:14.140248 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:14.175467 1201669 cri.go:89] found id: ""
	I1218 00:45:14.175481 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.175488 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:14.175493 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:14.175550 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:14.205623 1201669 cri.go:89] found id: ""
	I1218 00:45:14.205637 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.205649 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:14.205655 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:14.205727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:14.232765 1201669 cri.go:89] found id: ""
	I1218 00:45:14.232779 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.232786 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:14.232790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:14.232848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:14.259382 1201669 cri.go:89] found id: ""
	I1218 00:45:14.259396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.259403 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:14.259408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:14.259465 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:14.284118 1201669 cri.go:89] found id: ""
	I1218 00:45:14.284132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.284139 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:14.284144 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:14.284205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:14.308510 1201669 cri.go:89] found id: ""
	I1218 00:45:14.308530 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.308536 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:14.308552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:14.308619 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:14.336798 1201669 cri.go:89] found id: ""
	I1218 00:45:14.336811 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.336819 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:14.336826 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:14.336837 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:14.402054 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:14.402074 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:14.416289 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:14.416306 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:14.480242 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:14.480255 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:14.480265 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:14.549733 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:14.549753 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
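The connection-refused errors in the describe-nodes output above can be reproduced without kubectl: the apiserver port the failing calls target (8441, read off the `https://localhost:8441/api` URLs in the log) is simply not accepting TCP connections. A minimal bash probe, a sketch only, using bash's `/dev/tcp` redirection:

```shell
# Probe the apiserver endpoint the failing kubectl calls target.
# Port 8441 is taken from the log above; adjust for other profiles.
PORT="${PORT:-8441}"
if timeout 2 bash -c "exec 3<>/dev/tcp/localhost/$PORT" 2>/dev/null; then
  echo "port $PORT: open"
else
  echo "port $PORT: connection refused or timed out"
fi
```

When kube-apiserver is down, as in this run, the probe reports the same condition kubectl surfaces as `dial tcp [::1]:8441: connect: connection refused`.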
	I1218 00:45:17.078515 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:17.088248 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:17.088306 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:17.112977 1201669 cri.go:89] found id: ""
	I1218 00:45:17.112990 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.112998 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:17.113004 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:17.113062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:17.137141 1201669 cri.go:89] found id: ""
	I1218 00:45:17.137154 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.137161 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:17.137167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:17.137223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:17.166013 1201669 cri.go:89] found id: ""
	I1218 00:45:17.166026 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.166033 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:17.166038 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:17.166098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:17.194884 1201669 cri.go:89] found id: ""
	I1218 00:45:17.194906 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.194920 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:17.194925 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:17.194990 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:17.220329 1201669 cri.go:89] found id: ""
	I1218 00:45:17.220342 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.220349 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:17.220354 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:17.220415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:17.248333 1201669 cri.go:89] found id: ""
	I1218 00:45:17.248347 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.248353 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:17.248359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:17.248415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:17.273057 1201669 cri.go:89] found id: ""
	I1218 00:45:17.273074 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.273084 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:17.273093 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:17.273104 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:17.339448 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:17.339467 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:17.354635 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:17.354652 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:17.422682 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:17.422703 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:17.422714 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:17.490930 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:17.490951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.021992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:20.032625 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:20.032687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:20.060698 1201669 cri.go:89] found id: ""
	I1218 00:45:20.060712 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.060719 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:20.060724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:20.060785 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:20.086680 1201669 cri.go:89] found id: ""
	I1218 00:45:20.086694 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.086701 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:20.086706 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:20.086766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:20.112553 1201669 cri.go:89] found id: ""
	I1218 00:45:20.112567 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.112574 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:20.112579 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:20.112642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:20.137056 1201669 cri.go:89] found id: ""
	I1218 00:45:20.137070 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.137077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:20.137082 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:20.137148 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:20.175745 1201669 cri.go:89] found id: ""
	I1218 00:45:20.175758 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.175775 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:20.175780 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:20.175848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:20.205557 1201669 cri.go:89] found id: ""
	I1218 00:45:20.205570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.205578 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:20.205583 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:20.205645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:20.234726 1201669 cri.go:89] found id: ""
	I1218 00:45:20.234739 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.234746 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:20.234754 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:20.234773 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:20.303025 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:20.303044 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.331096 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:20.331118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:20.397831 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:20.397856 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:20.412745 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:20.412761 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:20.480267 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
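The per-component scan that repeats throughout the log (seven `crictl ps` calls, each returning an empty ID list) can be sketched as one loop. `CRICTL` is parameterized here, an assumption for illustration, so the sketch runs without root or a CRI socket; the log itself uses `sudo crictl`:

```shell
# Same scan minikube performs above: list containers for each control-plane
# component and warn when none match. Flags mirror the Run: lines verbatim.
CRICTL="${CRICTL:-crictl}"   # the log uses: sudo crictl
missing=0
for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
            kube-controller-manager kindnet; do
  ids=$($CRICTL ps -a --quiet --name="$name" 2>/dev/null)
  if [ -z "$ids" ]; then
    echo "No container was found matching \"$name\""
    missing=$((missing + 1))
  fi
done
echo "$missing of 7 components have no container"
```

In this run all seven scans come back empty, which is why the loop falls through to gathering kubelet, dmesg, CRI-O, and container-status logs instead of component logs.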
	I1218 00:45:22.980543 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:22.990690 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:22.990747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:23.016762 1201669 cri.go:89] found id: ""
	I1218 00:45:23.016795 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.016802 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:23.016807 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:23.016868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:23.042294 1201669 cri.go:89] found id: ""
	I1218 00:45:23.042308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.042315 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:23.042320 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:23.042379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:23.071377 1201669 cri.go:89] found id: ""
	I1218 00:45:23.071392 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.071399 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:23.071405 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:23.071463 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:23.096911 1201669 cri.go:89] found id: ""
	I1218 00:45:23.096925 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.096932 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:23.096938 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:23.097002 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:23.123350 1201669 cri.go:89] found id: ""
	I1218 00:45:23.123363 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.123370 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:23.123375 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:23.123455 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:23.156384 1201669 cri.go:89] found id: ""
	I1218 00:45:23.156397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.156404 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:23.156409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:23.156470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:23.198764 1201669 cri.go:89] found id: ""
	I1218 00:45:23.198777 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.198784 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:23.198792 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:23.198802 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:23.276991 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:23.277016 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:23.305929 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:23.305946 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:23.374243 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:23.374263 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:23.389391 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:23.389408 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:23.455741 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:25.956010 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:25.966329 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:25.966402 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:25.992362 1201669 cri.go:89] found id: ""
	I1218 00:45:25.992376 1201669 logs.go:282] 0 containers: []
	W1218 00:45:25.992383 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:25.992388 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:25.992446 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:26.020474 1201669 cri.go:89] found id: ""
	I1218 00:45:26.020487 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.020495 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:26.020500 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:26.020562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:26.053060 1201669 cri.go:89] found id: ""
	I1218 00:45:26.053083 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.053090 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:26.053096 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:26.053168 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:26.080555 1201669 cri.go:89] found id: ""
	I1218 00:45:26.080570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.080577 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:26.080582 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:26.080642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:26.106383 1201669 cri.go:89] found id: ""
	I1218 00:45:26.106396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.106405 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:26.106413 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:26.106472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:26.133033 1201669 cri.go:89] found id: ""
	I1218 00:45:26.133046 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.133053 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:26.133059 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:26.133114 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:26.166644 1201669 cri.go:89] found id: ""
	I1218 00:45:26.166662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.166669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:26.166683 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:26.166693 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:26.249137 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:26.249156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:26.266352 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:26.266372 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:26.337214 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:26.337225 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:26.337235 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:26.407577 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:26.407597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:28.937809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:28.947798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:28.947860 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:28.972642 1201669 cri.go:89] found id: ""
	I1218 00:45:28.972655 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.972662 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:28.972667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:28.972727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:28.997812 1201669 cri.go:89] found id: ""
	I1218 00:45:28.997827 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.997834 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:28.997839 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:28.997897 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:29.025172 1201669 cri.go:89] found id: ""
	I1218 00:45:29.025188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.025195 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:29.025200 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:29.025261 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:29.050129 1201669 cri.go:89] found id: ""
	I1218 00:45:29.050143 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.050151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:29.050156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:29.050216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:29.074056 1201669 cri.go:89] found id: ""
	I1218 00:45:29.074069 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.074076 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:29.074081 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:29.074138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:29.102343 1201669 cri.go:89] found id: ""
	I1218 00:45:29.102356 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.102363 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:29.102369 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:29.102426 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:29.126969 1201669 cri.go:89] found id: ""
	I1218 00:45:29.126982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.126989 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:29.126996 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:29.127007 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:29.201687 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:29.201704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:29.216680 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:29.216696 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:29.290639 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:29.290648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:29.290670 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:29.363990 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:29.364013 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:31.902567 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:31.912532 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:31.912590 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:31.938305 1201669 cri.go:89] found id: ""
	I1218 00:45:31.938319 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.938326 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:31.938331 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:31.938387 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:31.962545 1201669 cri.go:89] found id: ""
	I1218 00:45:31.962558 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.962565 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:31.962570 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:31.962632 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:31.987508 1201669 cri.go:89] found id: ""
	I1218 00:45:31.987521 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.987529 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:31.987534 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:31.987592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:32.014382 1201669 cri.go:89] found id: ""
	I1218 00:45:32.014395 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.014402 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:32.014408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:32.014474 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:32.041186 1201669 cri.go:89] found id: ""
	I1218 00:45:32.041200 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.041207 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:32.041212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:32.041271 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:32.067285 1201669 cri.go:89] found id: ""
	I1218 00:45:32.067308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.067316 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:32.067322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:32.067382 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:32.094234 1201669 cri.go:89] found id: ""
	I1218 00:45:32.094247 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.094254 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:32.094262 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:32.094272 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:32.164781 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:32.164800 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:32.197838 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:32.197854 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:32.268628 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:32.268648 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:32.282984 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:32.283001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:32.352888 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:34.853182 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:34.863312 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:34.863372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:34.887731 1201669 cri.go:89] found id: ""
	I1218 00:45:34.887745 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.887751 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:34.887756 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:34.887813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:34.913433 1201669 cri.go:89] found id: ""
	I1218 00:45:34.913446 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.913453 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:34.913458 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:34.913525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:34.938029 1201669 cri.go:89] found id: ""
	I1218 00:45:34.938043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.938050 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:34.938056 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:34.938125 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:34.963314 1201669 cri.go:89] found id: ""
	I1218 00:45:34.963327 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.963334 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:34.963339 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:34.963395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:34.991684 1201669 cri.go:89] found id: ""
	I1218 00:45:34.991699 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.991706 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:34.991711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:34.991775 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:35.019323 1201669 cri.go:89] found id: ""
	I1218 00:45:35.019338 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.019344 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:35.019350 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:35.019412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:35.044946 1201669 cri.go:89] found id: ""
	I1218 00:45:35.044960 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.044966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:35.044975 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:35.044986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:35.059688 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:35.059704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:35.127679 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:35.127700 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:35.127711 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:35.200793 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:35.200812 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:35.229277 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:35.229293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:37.797709 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:37.807597 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:37.807657 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:37.834367 1201669 cri.go:89] found id: ""
	I1218 00:45:37.834381 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.834399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:37.834404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:37.834466 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:37.862884 1201669 cri.go:89] found id: ""
	I1218 00:45:37.862898 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.862905 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:37.862910 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:37.862967 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:37.887715 1201669 cri.go:89] found id: ""
	I1218 00:45:37.887729 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.887736 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:37.887741 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:37.887800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:37.912412 1201669 cri.go:89] found id: ""
	I1218 00:45:37.912425 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.912432 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:37.912437 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:37.912500 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:37.936195 1201669 cri.go:89] found id: ""
	I1218 00:45:37.936209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.936216 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:37.936250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:37.936308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:37.964630 1201669 cri.go:89] found id: ""
	I1218 00:45:37.964645 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.964658 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:37.964663 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:37.964718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:37.997426 1201669 cri.go:89] found id: ""
	I1218 00:45:37.997439 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.997446 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:37.997454 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:37.997468 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:38.035686 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:38.035710 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:38.103558 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:38.103578 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:38.118520 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:38.118538 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:38.213391 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:38.213399 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:38.213410 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:40.782711 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:40.792421 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:40.792487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:40.816808 1201669 cri.go:89] found id: ""
	I1218 00:45:40.816821 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.816828 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:40.816833 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:40.816889 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:40.842296 1201669 cri.go:89] found id: ""
	I1218 00:45:40.842309 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.842316 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:40.842321 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:40.842381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:40.870550 1201669 cri.go:89] found id: ""
	I1218 00:45:40.870563 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.870570 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:40.870575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:40.870631 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:40.895987 1201669 cri.go:89] found id: ""
	I1218 00:45:40.896000 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.896007 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:40.896012 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:40.896071 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:40.922196 1201669 cri.go:89] found id: ""
	I1218 00:45:40.922209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.922217 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:40.922228 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:40.922287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:40.951012 1201669 cri.go:89] found id: ""
	I1218 00:45:40.951025 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.951032 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:40.951037 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:40.951094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:40.975029 1201669 cri.go:89] found id: ""
	I1218 00:45:40.975043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.975049 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:40.975057 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:40.975068 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:41.038362 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:41.038371 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:41.038383 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:41.106531 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:41.106550 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:41.133380 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:41.133396 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:41.202955 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:41.202974 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.720946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:43.730523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:43.730580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:43.758480 1201669 cri.go:89] found id: ""
	I1218 00:45:43.758494 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.758501 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:43.758506 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:43.758562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:43.782891 1201669 cri.go:89] found id: ""
	I1218 00:45:43.782904 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.782910 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:43.782915 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:43.782969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:43.807881 1201669 cri.go:89] found id: ""
	I1218 00:45:43.807895 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.807901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:43.807906 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:43.807962 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:43.831922 1201669 cri.go:89] found id: ""
	I1218 00:45:43.831934 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.831941 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:43.831946 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:43.832005 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:43.857303 1201669 cri.go:89] found id: ""
	I1218 00:45:43.857316 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.857323 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:43.857328 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:43.857385 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:43.882932 1201669 cri.go:89] found id: ""
	I1218 00:45:43.882945 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.882962 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:43.882967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:43.883034 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:43.910989 1201669 cri.go:89] found id: ""
	I1218 00:45:43.911003 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.911010 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:43.911017 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:43.911027 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:43.976855 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:43.976875 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.992065 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:43.992080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:44.066663 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:44.066673 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:44.066683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:44.136150 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:44.136169 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
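The poll cycle recorded above repeats the same sequence each time: check for a live `kube-apiserver` process, list CRI containers for each control-plane component, then gather kubelet/CRI-O/dmesg logs when nothing is found. A minimal sketch of that loop, which prints the commands minikube runs (taken verbatim from the log; run them on the node, e.g. via `minikube ssh --`, to reproduce the diagnosis — the `diag_commands` helper name is ours, not minikube's):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Print the diagnostic commands the log shows minikube running, in order.
diag_commands() {
  # 1. Is an apiserver process alive at all?
  echo 'sudo pgrep -xnf "kube-apiserver.*minikube.*"'

  # 2. List all CRI containers (any state) for each control-plane component.
  for c in kube-apiserver etcd coredns kube-scheduler kube-proxy \
           kube-controller-manager kindnet; do
    echo "sudo crictl ps -a --quiet --name=$c"
  done

  # 3. Gather supporting logs when no containers are found.
  echo 'sudo journalctl -u kubelet -n 400'
  echo 'sudo journalctl -u crio -n 400'
}

diag_commands
```

In this run every `crictl ps` query returns an empty ID list, which is why each cycle falls through to the log-gathering step and `kubectl describe nodes` fails with connection refused on `localhost:8441`.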
	I1218 00:45:46.674809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:46.685189 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:46.685253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:46.710336 1201669 cri.go:89] found id: ""
	I1218 00:45:46.710350 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.710357 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:46.710362 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:46.710423 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:46.735331 1201669 cri.go:89] found id: ""
	I1218 00:45:46.735344 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.735351 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:46.735356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:46.735412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:46.760113 1201669 cri.go:89] found id: ""
	I1218 00:45:46.760126 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.760133 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:46.760138 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:46.760192 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:46.785212 1201669 cri.go:89] found id: ""
	I1218 00:45:46.785225 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.785231 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:46.785237 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:46.785292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:46.810594 1201669 cri.go:89] found id: ""
	I1218 00:45:46.810607 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.810614 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:46.810619 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:46.810678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:46.835217 1201669 cri.go:89] found id: ""
	I1218 00:45:46.835231 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.835237 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:46.835242 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:46.835300 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:46.859864 1201669 cri.go:89] found id: ""
	I1218 00:45:46.859877 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.859891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:46.859899 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:46.859910 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.887041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:46.887057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:46.953500 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:46.953519 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:46.968086 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:46.968102 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:47.030071 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:47.030081 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:47.030091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.602443 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:49.612708 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:49.612770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:49.638886 1201669 cri.go:89] found id: ""
	I1218 00:45:49.638900 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.638907 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:49.638912 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:49.638969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:49.666118 1201669 cri.go:89] found id: ""
	I1218 00:45:49.666132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.666139 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:49.666145 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:49.666205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:49.695529 1201669 cri.go:89] found id: ""
	I1218 00:45:49.695542 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.695549 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:49.695554 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:49.695609 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:49.718430 1201669 cri.go:89] found id: ""
	I1218 00:45:49.718444 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.718451 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:49.718457 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:49.718514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:49.742944 1201669 cri.go:89] found id: ""
	I1218 00:45:49.742957 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.742964 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:49.742969 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:49.743028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:49.767863 1201669 cri.go:89] found id: ""
	I1218 00:45:49.767876 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.767888 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:49.767894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:49.767949 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:49.792207 1201669 cri.go:89] found id: ""
	I1218 00:45:49.792254 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.792261 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:49.792269 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:49.792279 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:49.806632 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:49.806655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:49.869094 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:49.869105 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:49.869130 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.936480 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:49.936498 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:49.965414 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:49.965430 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.533961 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:52.543970 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:52.544028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:52.569650 1201669 cri.go:89] found id: ""
	I1218 00:45:52.569663 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.569671 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:52.569676 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:52.569735 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:52.593935 1201669 cri.go:89] found id: ""
	I1218 00:45:52.593949 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.593955 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:52.593961 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:52.594019 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:52.618968 1201669 cri.go:89] found id: ""
	I1218 00:45:52.618982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.618989 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:52.618994 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:52.619051 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:52.647696 1201669 cri.go:89] found id: ""
	I1218 00:45:52.647710 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.647717 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:52.647728 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:52.647787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:52.675609 1201669 cri.go:89] found id: ""
	I1218 00:45:52.675622 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.675629 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:52.675634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:52.675690 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:52.701982 1201669 cri.go:89] found id: ""
	I1218 00:45:52.701995 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.702001 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:52.702007 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:52.702064 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:52.727053 1201669 cri.go:89] found id: ""
	I1218 00:45:52.727066 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.727073 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:52.727081 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:52.727091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.793606 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:52.793626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:52.807921 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:52.807938 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:52.871908 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:52.871918 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:52.871942 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:52.939995 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:52.940015 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.467573 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:55.477751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:55.477808 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:55.503215 1201669 cri.go:89] found id: ""
	I1218 00:45:55.503229 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.503235 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:55.503241 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:55.503299 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:55.528321 1201669 cri.go:89] found id: ""
	I1218 00:45:55.528334 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.528341 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:55.528346 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:55.528406 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:55.555566 1201669 cri.go:89] found id: ""
	I1218 00:45:55.555580 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.555586 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:55.555591 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:55.555659 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:55.580858 1201669 cri.go:89] found id: ""
	I1218 00:45:55.580870 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.580877 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:55.580882 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:55.580941 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:55.609703 1201669 cri.go:89] found id: ""
	I1218 00:45:55.609717 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.609724 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:55.609729 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:55.609792 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:55.635271 1201669 cri.go:89] found id: ""
	I1218 00:45:55.635285 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.635301 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:55.635307 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:55.635379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:55.664174 1201669 cri.go:89] found id: ""
	I1218 00:45:55.664188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.664203 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:55.664211 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:55.664247 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:55.678574 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:55.678597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:55.741880 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:55.741890 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:55.741900 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:55.814783 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:55.814804 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.845128 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:55.845151 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.416331 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:58.426299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:58.426355 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:58.457684 1201669 cri.go:89] found id: ""
	I1218 00:45:58.457698 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.457705 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:58.457710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:58.457769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:58.482307 1201669 cri.go:89] found id: ""
	I1218 00:45:58.482320 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.482327 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:58.482332 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:58.482389 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:58.507442 1201669 cri.go:89] found id: ""
	I1218 00:45:58.507454 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.507461 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:58.507466 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:58.507523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:58.536949 1201669 cri.go:89] found id: ""
	I1218 00:45:58.536963 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.536969 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:58.536974 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:58.537030 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:58.565233 1201669 cri.go:89] found id: ""
	I1218 00:45:58.565246 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.565253 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:58.565257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:58.565313 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:58.589568 1201669 cri.go:89] found id: ""
	I1218 00:45:58.589582 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.589589 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:58.589594 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:58.589655 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:58.613117 1201669 cri.go:89] found id: ""
	I1218 00:45:58.613130 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.613137 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:58.613145 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:58.613156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:58.681549 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:58.681572 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:58.709658 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:58.709678 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.778632 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:58.778651 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:58.793209 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:58.793225 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:58.857093 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:01.358084 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:01.368502 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:01.368561 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:01.399458 1201669 cri.go:89] found id: ""
	I1218 00:46:01.399490 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.399498 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:01.399504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:01.399589 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:01.429331 1201669 cri.go:89] found id: ""
	I1218 00:46:01.429346 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.429353 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:01.429359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:01.429418 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:01.463764 1201669 cri.go:89] found id: ""
	I1218 00:46:01.463777 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.463784 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:01.463792 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:01.463852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:01.490438 1201669 cri.go:89] found id: ""
	I1218 00:46:01.490451 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.490458 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:01.490464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:01.490523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:01.515150 1201669 cri.go:89] found id: ""
	I1218 00:46:01.515163 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.515170 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:01.515176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:01.515238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:01.541480 1201669 cri.go:89] found id: ""
	I1218 00:46:01.541494 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.541501 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:01.541507 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:01.541567 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:01.566788 1201669 cri.go:89] found id: ""
	I1218 00:46:01.566802 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.566809 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:01.566817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:01.566827 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:01.630909 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:01.622550   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.623304   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.624851   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.625362   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.626989   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:01.630919 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:01.630929 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:01.699339 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:01.699360 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:01.730198 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:01.730213 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:01.798536 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:01.798555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:04.314812 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:04.325258 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:04.325319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:04.350282 1201669 cri.go:89] found id: ""
	I1218 00:46:04.350302 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.350309 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:04.350314 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:04.350374 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:04.375290 1201669 cri.go:89] found id: ""
	I1218 00:46:04.375305 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.375311 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:04.375316 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:04.375381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:04.410898 1201669 cri.go:89] found id: ""
	I1218 00:46:04.410911 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.410918 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:04.410923 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:04.410980 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:04.448128 1201669 cri.go:89] found id: ""
	I1218 00:46:04.448141 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.448151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:04.448156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:04.448214 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:04.478635 1201669 cri.go:89] found id: ""
	I1218 00:46:04.478648 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.478655 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:04.478660 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:04.478718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:04.504261 1201669 cri.go:89] found id: ""
	I1218 00:46:04.504275 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.504282 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:04.504288 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:04.504345 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:04.529823 1201669 cri.go:89] found id: ""
	I1218 00:46:04.529836 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.529843 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:04.529851 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:04.529862 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:04.595056 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:04.587112   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.587762   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589353   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589778   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.591223   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:04.595066 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:04.595076 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:04.665580 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:04.665600 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:04.695540 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:04.695555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:04.766700 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:04.766721 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.281438 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:07.291184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:07.291241 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:07.318270 1201669 cri.go:89] found id: ""
	I1218 00:46:07.318283 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.318290 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:07.318295 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:07.318353 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:07.342684 1201669 cri.go:89] found id: ""
	I1218 00:46:07.342697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.342704 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:07.342718 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:07.342777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:07.367159 1201669 cri.go:89] found id: ""
	I1218 00:46:07.367173 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.367180 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:07.367186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:07.367252 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:07.399917 1201669 cri.go:89] found id: ""
	I1218 00:46:07.399942 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.399949 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:07.399954 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:07.400025 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:07.428891 1201669 cri.go:89] found id: ""
	I1218 00:46:07.428904 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.428911 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:07.428918 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:07.428988 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:07.461232 1201669 cri.go:89] found id: ""
	I1218 00:46:07.461244 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.461251 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:07.461257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:07.461319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:07.487577 1201669 cri.go:89] found id: ""
	I1218 00:46:07.487590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.487607 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:07.487616 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:07.487626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:07.554637 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:07.554656 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.570064 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:07.570080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:07.635097 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:07.627057   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.627642   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629308   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629740   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.631233   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:07.635107 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:07.635118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:07.706762 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:07.706782 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.235305 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:10.245498 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:10.245568 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:10.275954 1201669 cri.go:89] found id: ""
	I1218 00:46:10.275965 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.275972 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:10.275985 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:10.276042 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:10.301377 1201669 cri.go:89] found id: ""
	I1218 00:46:10.301391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.301397 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:10.301402 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:10.301468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:10.327075 1201669 cri.go:89] found id: ""
	I1218 00:46:10.327089 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.327096 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:10.327101 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:10.327163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:10.355039 1201669 cri.go:89] found id: ""
	I1218 00:46:10.355052 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.355059 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:10.355064 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:10.355126 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:10.380800 1201669 cri.go:89] found id: ""
	I1218 00:46:10.380814 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.380821 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:10.380826 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:10.380883 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:10.420766 1201669 cri.go:89] found id: ""
	I1218 00:46:10.420781 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.420788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:10.420794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:10.420852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:10.450993 1201669 cri.go:89] found id: ""
	I1218 00:46:10.451006 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.451013 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:10.451021 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:10.451031 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:10.469649 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:10.469664 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:10.534853 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:10.534862 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:10.534873 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:10.603061 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:10.603080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.634944 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:10.634961 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.201986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:13.212552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:13.212611 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:13.236454 1201669 cri.go:89] found id: ""
	I1218 00:46:13.236468 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.236475 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:13.236481 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:13.236542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:13.261394 1201669 cri.go:89] found id: ""
	I1218 00:46:13.261408 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.261415 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:13.261420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:13.261479 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:13.286366 1201669 cri.go:89] found id: ""
	I1218 00:46:13.286380 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.286393 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:13.286398 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:13.286457 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:13.311045 1201669 cri.go:89] found id: ""
	I1218 00:46:13.311058 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.311065 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:13.311070 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:13.311132 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:13.336414 1201669 cri.go:89] found id: ""
	I1218 00:46:13.336427 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.336434 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:13.336439 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:13.336503 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:13.366089 1201669 cri.go:89] found id: ""
	I1218 00:46:13.366102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.366109 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:13.366114 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:13.366170 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:13.398167 1201669 cri.go:89] found id: ""
	I1218 00:46:13.398180 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.398187 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:13.398195 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:13.398205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.472148 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:13.472173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:13.487248 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:13.487267 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:13.552950 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:13.552960 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:13.552973 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:13.622039 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:13.622058 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:16.149384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:16.159725 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:16.159786 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:16.186967 1201669 cri.go:89] found id: ""
	I1218 00:46:16.186981 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.186988 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:16.186993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:16.187052 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:16.213347 1201669 cri.go:89] found id: ""
	I1218 00:46:16.213361 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.213368 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:16.213374 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:16.213431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:16.239666 1201669 cri.go:89] found id: ""
	I1218 00:46:16.239679 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.239686 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:16.239692 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:16.239747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:16.264667 1201669 cri.go:89] found id: ""
	I1218 00:46:16.264680 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.264686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:16.264691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:16.264747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:16.290913 1201669 cri.go:89] found id: ""
	I1218 00:46:16.290925 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.290932 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:16.290937 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:16.290995 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:16.318436 1201669 cri.go:89] found id: ""
	I1218 00:46:16.318449 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.318458 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:16.318464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:16.318522 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:16.344303 1201669 cri.go:89] found id: ""
	I1218 00:46:16.344316 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.344323 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:16.344331 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:16.344342 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:16.411796 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:16.411814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:16.427899 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:16.427916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:16.499022 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:16.499032 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:16.499042 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:16.568931 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:16.568951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.102749 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:19.112504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:19.112560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:19.140375 1201669 cri.go:89] found id: ""
	I1218 00:46:19.140389 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.140396 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:19.140401 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:19.140462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:19.170806 1201669 cri.go:89] found id: ""
	I1218 00:46:19.170832 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.170840 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:19.170848 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:19.170930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:19.202879 1201669 cri.go:89] found id: ""
	I1218 00:46:19.202894 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.202901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:19.202907 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:19.202973 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:19.226832 1201669 cri.go:89] found id: ""
	I1218 00:46:19.226844 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.226851 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:19.226856 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:19.226913 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:19.251251 1201669 cri.go:89] found id: ""
	I1218 00:46:19.251264 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.251271 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:19.251277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:19.251334 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:19.275051 1201669 cri.go:89] found id: ""
	I1218 00:46:19.275064 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.275071 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:19.275080 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:19.275138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:19.303255 1201669 cri.go:89] found id: ""
	I1218 00:46:19.303268 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.303291 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:19.303299 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:19.303309 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.332819 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:19.332836 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:19.398262 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:19.398281 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:19.413015 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:19.413030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:19.483412 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:19.483423 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:19.483475 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:22.052118 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:22.062390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:22.062454 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:22.091903 1201669 cri.go:89] found id: ""
	I1218 00:46:22.091917 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.091924 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:22.091930 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:22.091987 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:22.116458 1201669 cri.go:89] found id: ""
	I1218 00:46:22.116471 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.116478 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:22.116483 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:22.116560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:22.142090 1201669 cri.go:89] found id: ""
	I1218 00:46:22.142102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.142109 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:22.142115 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:22.142180 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:22.166148 1201669 cri.go:89] found id: ""
	I1218 00:46:22.166162 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.166169 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:22.166175 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:22.166234 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:22.191864 1201669 cri.go:89] found id: ""
	I1218 00:46:22.191877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.191884 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:22.191890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:22.191953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:22.216176 1201669 cri.go:89] found id: ""
	I1218 00:46:22.216190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.216197 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:22.216202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:22.216283 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:22.240865 1201669 cri.go:89] found id: ""
	I1218 00:46:22.240878 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.240891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:22.240898 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:22.240908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:22.269665 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:22.269688 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:22.334885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:22.334903 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:22.349240 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:22.349256 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:22.424972 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:22.424982 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:22.425001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.004463 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:25.015873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:25.015934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:25.043533 1201669 cri.go:89] found id: ""
	I1218 00:46:25.043547 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.043558 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:25.043563 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:25.043630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:25.070860 1201669 cri.go:89] found id: ""
	I1218 00:46:25.070874 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.070881 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:25.070887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:25.070945 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:25.100326 1201669 cri.go:89] found id: ""
	I1218 00:46:25.100340 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.100349 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:25.100356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:25.100420 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:25.127292 1201669 cri.go:89] found id: ""
	I1218 00:46:25.127306 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.127313 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:25.127318 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:25.127376 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:25.152929 1201669 cri.go:89] found id: ""
	I1218 00:46:25.152943 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.152950 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:25.152955 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:25.153023 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:25.179602 1201669 cri.go:89] found id: ""
	I1218 00:46:25.179622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.179629 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:25.179634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:25.179691 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:25.204777 1201669 cri.go:89] found id: ""
	I1218 00:46:25.204790 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.204797 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:25.204804 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:25.204814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.274359 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:25.274379 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:25.305207 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:25.305224 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:25.375922 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:25.375941 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:25.392181 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:25.392196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:25.470714 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:27.970992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:27.980972 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:27.981029 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:28.007728 1201669 cri.go:89] found id: ""
	I1218 00:46:28.007744 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.007752 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:28.007758 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:28.007821 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:28.038973 1201669 cri.go:89] found id: ""
	I1218 00:46:28.038987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.038995 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:28.039000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:28.039063 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:28.066609 1201669 cri.go:89] found id: ""
	I1218 00:46:28.066622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.066629 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:28.066634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:28.066695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:28.092484 1201669 cri.go:89] found id: ""
	I1218 00:46:28.092498 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.092506 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:28.092512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:28.092583 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:28.119611 1201669 cri.go:89] found id: ""
	I1218 00:46:28.119625 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.119632 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:28.119638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:28.119698 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:28.145154 1201669 cri.go:89] found id: ""
	I1218 00:46:28.145167 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.145175 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:28.145180 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:28.145238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:28.170178 1201669 cri.go:89] found id: ""
	I1218 00:46:28.170191 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.170198 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:28.170206 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:28.170216 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:28.235805 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:28.235824 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:28.250608 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:28.250629 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:28.314678 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:28.314687 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:28.314698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:28.383399 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:28.383420 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:30.924810 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:30.935068 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:30.935128 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:30.960550 1201669 cri.go:89] found id: ""
	I1218 00:46:30.960563 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.960570 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:30.960575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:30.960636 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:30.985705 1201669 cri.go:89] found id: ""
	I1218 00:46:30.985718 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.985725 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:30.985730 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:30.985787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:31.011725 1201669 cri.go:89] found id: ""
	I1218 00:46:31.011739 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.011746 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:31.011751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:31.011813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:31.038735 1201669 cri.go:89] found id: ""
	I1218 00:46:31.038748 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.038755 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:31.038760 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:31.038822 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:31.062623 1201669 cri.go:89] found id: ""
	I1218 00:46:31.062637 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.062645 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:31.062651 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:31.062716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:31.089339 1201669 cri.go:89] found id: ""
	I1218 00:46:31.089353 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.089366 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:31.089372 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:31.089431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:31.119659 1201669 cri.go:89] found id: ""
	I1218 00:46:31.119672 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.119679 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:31.119687 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:31.119698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:31.185677 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:31.185697 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:31.200077 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:31.200092 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:31.263573 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:31.263582 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:31.263593 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:31.331836 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:31.331857 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:33.859870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:33.871250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:33.871309 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:33.899076 1201669 cri.go:89] found id: ""
	I1218 00:46:33.899090 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.899097 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:33.899103 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:33.899163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:33.927937 1201669 cri.go:89] found id: ""
	I1218 00:46:33.927955 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.927961 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:33.927967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:33.928024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:33.954257 1201669 cri.go:89] found id: ""
	I1218 00:46:33.954271 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.954278 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:33.954283 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:33.954339 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:33.978840 1201669 cri.go:89] found id: ""
	I1218 00:46:33.978853 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.978860 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:33.978865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:33.978921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:34.008172 1201669 cri.go:89] found id: ""
	I1218 00:46:34.008186 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.008193 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:34.008198 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:34.008296 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:34.038029 1201669 cri.go:89] found id: ""
	I1218 00:46:34.038043 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.038050 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:34.038057 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:34.038116 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:34.067280 1201669 cri.go:89] found id: ""
	I1218 00:46:34.067294 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.067302 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:34.067311 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:34.067321 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:34.099533 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:34.099549 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:34.165421 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:34.165442 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:34.179966 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:34.179981 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:34.243670 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:34.243681 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:34.243694 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:36.812424 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:36.822427 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:36.822486 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:36.847846 1201669 cri.go:89] found id: ""
	I1218 00:46:36.847859 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.847866 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:36.847872 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:36.847927 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:36.873323 1201669 cri.go:89] found id: ""
	I1218 00:46:36.873337 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.873344 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:36.873349 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:36.873408 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:36.898528 1201669 cri.go:89] found id: ""
	I1218 00:46:36.898541 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.898547 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:36.898553 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:36.898608 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:36.925176 1201669 cri.go:89] found id: ""
	I1218 00:46:36.925190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.925197 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:36.925202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:36.925260 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:36.954449 1201669 cri.go:89] found id: ""
	I1218 00:46:36.954463 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.954469 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:36.954474 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:36.954533 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:36.978226 1201669 cri.go:89] found id: ""
	I1218 00:46:36.978239 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.978246 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:36.978251 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:36.978308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:37.005731 1201669 cri.go:89] found id: ""
	I1218 00:46:37.005747 1201669 logs.go:282] 0 containers: []
	W1218 00:46:37.005755 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:37.005764 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:37.005776 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:37.026584 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:37.026606 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:37.089657 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:37.089672 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:37.089683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:37.161954 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:37.161980 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:37.189136 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:37.189155 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:39.765929 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:39.776452 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:39.776510 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:39.801519 1201669 cri.go:89] found id: ""
	I1218 00:46:39.801532 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.801539 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:39.801544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:39.801604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:39.829201 1201669 cri.go:89] found id: ""
	I1218 00:46:39.829215 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.829222 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:39.829226 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:39.829287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:39.854274 1201669 cri.go:89] found id: ""
	I1218 00:46:39.854287 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.854294 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:39.854299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:39.854357 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:39.879811 1201669 cri.go:89] found id: ""
	I1218 00:46:39.879824 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.879831 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:39.879836 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:39.879893 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:39.912296 1201669 cri.go:89] found id: ""
	I1218 00:46:39.912310 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.912317 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:39.912322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:39.912380 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:39.939288 1201669 cri.go:89] found id: ""
	I1218 00:46:39.939313 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.939321 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:39.939326 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:39.939393 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:39.967012 1201669 cri.go:89] found id: ""
	I1218 00:46:39.967027 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.967034 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:39.967041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:39.967051 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:40.033896 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:40.033919 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:40.052546 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:40.052564 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:40.123489 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:40.123524 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:40.123537 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:40.195140 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:40.195161 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:42.731664 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:42.741511 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:42.741573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:42.765856 1201669 cri.go:89] found id: ""
	I1218 00:46:42.765869 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.765876 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:42.765881 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:42.765947 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:42.790000 1201669 cri.go:89] found id: ""
	I1218 00:46:42.790013 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.790020 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:42.790025 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:42.790080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:42.814497 1201669 cri.go:89] found id: ""
	I1218 00:46:42.814511 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.814518 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:42.814523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:42.814580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:42.839923 1201669 cri.go:89] found id: ""
	I1218 00:46:42.839937 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.839943 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:42.839948 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:42.840009 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:42.866771 1201669 cri.go:89] found id: ""
	I1218 00:46:42.866784 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.866791 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:42.866798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:42.866856 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:42.894391 1201669 cri.go:89] found id: ""
	I1218 00:46:42.894404 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.894411 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:42.894416 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:42.894481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:42.919369 1201669 cri.go:89] found id: ""
	I1218 00:46:42.919391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.919399 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:42.919408 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:42.919419 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:42.934812 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:42.934829 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:42.998153 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:42.998162 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:42.998173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:43.067475 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:43.067494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:43.097319 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:43.097335 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:45.664349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:45.675110 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:45.675171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:45.707428 1201669 cri.go:89] found id: ""
	I1218 00:46:45.707442 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.707449 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:45.707454 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:45.707512 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:45.732673 1201669 cri.go:89] found id: ""
	I1218 00:46:45.732687 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.732694 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:45.732700 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:45.732759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:45.756652 1201669 cri.go:89] found id: ""
	I1218 00:46:45.756666 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.756673 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:45.756679 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:45.756741 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:45.781416 1201669 cri.go:89] found id: ""
	I1218 00:46:45.781430 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.781437 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:45.781442 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:45.781498 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:45.806268 1201669 cri.go:89] found id: ""
	I1218 00:46:45.806281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.806288 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:45.806294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:45.806363 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:45.831015 1201669 cri.go:89] found id: ""
	I1218 00:46:45.831028 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.831035 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:45.831040 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:45.831098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:45.855951 1201669 cri.go:89] found id: ""
	I1218 00:46:45.855964 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.855970 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:45.855978 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:45.855988 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:45.870419 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:45.870436 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:45.934620 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:45.934630 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:45.934641 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:46.007377 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:46.007400 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:46.038285 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:46.038302 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.604685 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:48.614701 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:48.614759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:48.640971 1201669 cri.go:89] found id: ""
	I1218 00:46:48.640984 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.640991 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:48.640997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:48.641055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:48.670241 1201669 cri.go:89] found id: ""
	I1218 00:46:48.670254 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.670261 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:48.670266 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:48.670324 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:48.714267 1201669 cri.go:89] found id: ""
	I1218 00:46:48.714281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.714288 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:48.714294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:48.714359 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:48.743058 1201669 cri.go:89] found id: ""
	I1218 00:46:48.743071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.743077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:48.743083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:48.743146 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:48.768865 1201669 cri.go:89] found id: ""
	I1218 00:46:48.768877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.768885 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:48.768890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:48.768950 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:48.794057 1201669 cri.go:89] found id: ""
	I1218 00:46:48.794071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.794078 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:48.794083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:48.794139 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:48.824069 1201669 cri.go:89] found id: ""
	I1218 00:46:48.824082 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.824090 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:48.824102 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:48.824112 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.893155 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:48.893176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:48.908605 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:48.908621 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:48.974531 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:48.974541 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:48.974551 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:49.047912 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:49.047931 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.578760 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:51.588638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:51.588697 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:51.620629 1201669 cri.go:89] found id: ""
	I1218 00:46:51.620643 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.620649 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:51.620661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:51.620737 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:51.653267 1201669 cri.go:89] found id: ""
	I1218 00:46:51.653281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.653297 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:51.653302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:51.653372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:51.680215 1201669 cri.go:89] found id: ""
	I1218 00:46:51.680250 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.680257 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:51.680263 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:51.680328 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:51.712435 1201669 cri.go:89] found id: ""
	I1218 00:46:51.712448 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.712455 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:51.712460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:51.712525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:51.740973 1201669 cri.go:89] found id: ""
	I1218 00:46:51.740987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.740994 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:51.741000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:51.741057 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:51.765683 1201669 cri.go:89] found id: ""
	I1218 00:46:51.765697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.765704 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:51.765710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:51.765767 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:51.793065 1201669 cri.go:89] found id: ""
	I1218 00:46:51.793080 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.793088 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:51.793095 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:51.793106 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:51.807847 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:51.807863 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:51.870944 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:51.870953 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:51.870964 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:51.939037 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:51.939057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.973517 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:51.973532 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:54.540109 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:54.550150 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:54.550216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:54.579006 1201669 cri.go:89] found id: ""
	I1218 00:46:54.579019 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.579026 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:54.579031 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:54.579088 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:54.609045 1201669 cri.go:89] found id: ""
	I1218 00:46:54.609059 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.609066 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:54.609071 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:54.609130 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:54.640693 1201669 cri.go:89] found id: ""
	I1218 00:46:54.640707 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.640714 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:54.640720 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:54.640777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:54.674577 1201669 cri.go:89] found id: ""
	I1218 00:46:54.674590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.674597 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:54.674603 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:54.674658 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:54.709862 1201669 cri.go:89] found id: ""
	I1218 00:46:54.709875 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.709882 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:54.709887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:54.709946 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:54.735151 1201669 cri.go:89] found id: ""
	I1218 00:46:54.735165 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.735171 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:54.735177 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:54.735237 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:54.762946 1201669 cri.go:89] found id: ""
	I1218 00:46:54.762960 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.762966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:54.762974 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:54.762984 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:54.778250 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:54.778266 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:54.841698 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:54.841707 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:54.841718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:54.909164 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:54.909183 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:54.946219 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:54.946236 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.515189 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:57.525323 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:57.525384 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:57.550694 1201669 cri.go:89] found id: ""
	I1218 00:46:57.550708 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.550716 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:57.550721 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:57.550782 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:57.578567 1201669 cri.go:89] found id: ""
	I1218 00:46:57.578582 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.578590 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:57.578595 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:57.578656 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:57.604092 1201669 cri.go:89] found id: ""
	I1218 00:46:57.604105 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.604112 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:57.604120 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:57.604178 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:57.628719 1201669 cri.go:89] found id: ""
	I1218 00:46:57.628733 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.628739 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:57.628744 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:57.628806 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:57.666872 1201669 cri.go:89] found id: ""
	I1218 00:46:57.666885 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.666892 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:57.666897 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:57.666954 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:57.703636 1201669 cri.go:89] found id: ""
	I1218 00:46:57.703649 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.703656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:57.703661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:57.703721 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:57.729878 1201669 cri.go:89] found id: ""
	I1218 00:46:57.729891 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.729898 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:57.729905 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:57.729916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.793892 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:57.793911 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:57.808664 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:57.808680 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:57.871552 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:57.871570 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:57.871582 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:57.939629 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:57.939649 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:00.470791 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:00.480890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:00.480955 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:00.510265 1201669 cri.go:89] found id: ""
	I1218 00:47:00.510278 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.510285 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:00.510290 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:00.510349 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:00.534908 1201669 cri.go:89] found id: ""
	I1218 00:47:00.534922 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.534929 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:00.534934 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:00.534992 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:00.559619 1201669 cri.go:89] found id: ""
	I1218 00:47:00.559632 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.559639 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:00.559644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:00.559705 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:00.587698 1201669 cri.go:89] found id: ""
	I1218 00:47:00.587711 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.587719 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:00.587724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:00.587781 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:00.611884 1201669 cri.go:89] found id: ""
	I1218 00:47:00.611897 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.611904 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:00.611909 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:00.611974 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:00.640874 1201669 cri.go:89] found id: ""
	I1218 00:47:00.640888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.640895 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:00.640900 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:00.640965 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:00.674185 1201669 cri.go:89] found id: ""
	I1218 00:47:00.674198 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.674205 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:00.674213 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:00.674223 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:00.750327 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:00.750347 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:00.765877 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:00.765899 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:00.831441 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:47:00.831450 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:00.831462 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:00.899398 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:00.899423 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:03.427398 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:03.437572 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:03.437634 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:03.466927 1201669 cri.go:89] found id: ""
	I1218 00:47:03.466940 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.466948 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:03.466952 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:03.467011 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:03.492647 1201669 cri.go:89] found id: ""
	I1218 00:47:03.492661 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.492668 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:03.492672 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:03.492729 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:03.522689 1201669 cri.go:89] found id: ""
	I1218 00:47:03.522702 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.522709 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:03.522714 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:03.522774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:03.547665 1201669 cri.go:89] found id: ""
	I1218 00:47:03.547679 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.547686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:03.547691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:03.547754 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:03.573125 1201669 cri.go:89] found id: ""
	I1218 00:47:03.573139 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.573146 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:03.573151 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:03.573209 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:03.598799 1201669 cri.go:89] found id: ""
	I1218 00:47:03.598812 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.598819 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:03.598825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:03.598882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:03.622999 1201669 cri.go:89] found id: ""
	I1218 00:47:03.623013 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.623019 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:03.623027 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:03.623037 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:03.697686 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:03.697703 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:03.715817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:03.715833 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:03.782593 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:47:03.782603 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:03.782616 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:03.850592 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:03.850611 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:06.381230 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:06.390993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:06.391053 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:06.414602 1201669 cri.go:89] found id: ""
	I1218 00:47:06.414616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.414622 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:06.414628 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:06.414684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:06.438729 1201669 cri.go:89] found id: ""
	I1218 00:47:06.438743 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.438750 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:06.438755 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:06.438820 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:06.463196 1201669 cri.go:89] found id: ""
	I1218 00:47:06.463208 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.463215 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:06.463220 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:06.463275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:06.488161 1201669 cri.go:89] found id: ""
	I1218 00:47:06.488174 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.488181 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:06.488186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:06.488275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:06.517546 1201669 cri.go:89] found id: ""
	I1218 00:47:06.517559 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.517566 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:06.517571 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:06.517630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:06.541811 1201669 cri.go:89] found id: ""
	I1218 00:47:06.541825 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.541831 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:06.541837 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:06.541894 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:06.565470 1201669 cri.go:89] found id: ""
	I1218 00:47:06.565483 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.565491 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:06.565501 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:06.565511 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:06.630810 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:06.630828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:06.650036 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:06.650061 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:06.735359 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:06.735369 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:06.735382 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:06.804427 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:06.804447 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.337315 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:09.347711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:09.347770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:09.373796 1201669 cri.go:89] found id: ""
	I1218 00:47:09.373809 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.373817 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:09.373823 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:09.373887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:09.398745 1201669 cri.go:89] found id: ""
	I1218 00:47:09.398759 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.398766 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:09.398783 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:09.398850 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:09.424602 1201669 cri.go:89] found id: ""
	I1218 00:47:09.424616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.424623 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:09.424630 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:09.424687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:09.453853 1201669 cri.go:89] found id: ""
	I1218 00:47:09.453866 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.453873 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:09.453879 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:09.453934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:09.482334 1201669 cri.go:89] found id: ""
	I1218 00:47:09.482348 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.482355 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:09.482360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:09.482415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:09.514905 1201669 cri.go:89] found id: ""
	I1218 00:47:09.514928 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.514935 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:09.514941 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:09.515006 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:09.538866 1201669 cri.go:89] found id: ""
	I1218 00:47:09.538888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.538895 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:09.538903 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:09.538913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:09.553496 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:09.553516 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:09.615452 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:09.615461 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:09.615472 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:09.683616 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:09.683638 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.715893 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:09.715908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.282722 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:12.292327 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:12.292388 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:12.317025 1201669 cri.go:89] found id: ""
	I1218 00:47:12.317039 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.317045 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:12.317050 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:12.317106 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:12.341477 1201669 cri.go:89] found id: ""
	I1218 00:47:12.341490 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.341497 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:12.341501 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:12.341556 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:12.365784 1201669 cri.go:89] found id: ""
	I1218 00:47:12.365798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.365805 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:12.365810 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:12.365870 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:12.394874 1201669 cri.go:89] found id: ""
	I1218 00:47:12.394887 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.394894 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:12.394899 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:12.394958 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:12.419496 1201669 cri.go:89] found id: ""
	I1218 00:47:12.419509 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.419516 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:12.419521 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:12.419577 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:12.444379 1201669 cri.go:89] found id: ""
	I1218 00:47:12.444393 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.444399 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:12.444414 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:12.444470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:12.468918 1201669 cri.go:89] found id: ""
	I1218 00:47:12.468931 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.468939 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:12.468946 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:12.468960 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:12.537486 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:12.537505 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:12.568974 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:12.568990 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.635070 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:12.635089 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:12.652372 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:12.652388 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:12.728630 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:15.228895 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:15.239250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:15.239307 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:15.264983 1201669 cri.go:89] found id: ""
	I1218 00:47:15.264996 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.265003 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:15.265009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:15.265070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:15.293517 1201669 cri.go:89] found id: ""
	I1218 00:47:15.293531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.293537 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:15.293542 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:15.293599 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:15.319218 1201669 cri.go:89] found id: ""
	I1218 00:47:15.319231 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.319238 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:15.319243 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:15.319298 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:15.344396 1201669 cri.go:89] found id: ""
	I1218 00:47:15.344410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.344417 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:15.344422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:15.344481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:15.373243 1201669 cri.go:89] found id: ""
	I1218 00:47:15.373256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.373263 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:15.373268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:15.373329 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:15.397807 1201669 cri.go:89] found id: ""
	I1218 00:47:15.397820 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.397827 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:15.397832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:15.397887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:15.422535 1201669 cri.go:89] found id: ""
	I1218 00:47:15.422549 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.422557 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:15.422564 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:15.422574 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:15.490575 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:15.490595 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:15.521157 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:15.521176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:15.592728 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:15.592747 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:15.607949 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:15.607965 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:15.688565 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.190283 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:18.200009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:18.200073 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:18.224427 1201669 cri.go:89] found id: ""
	I1218 00:47:18.224440 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.224447 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:18.224453 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:18.224514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:18.248627 1201669 cri.go:89] found id: ""
	I1218 00:47:18.248641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.248648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:18.248653 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:18.248711 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:18.275672 1201669 cri.go:89] found id: ""
	I1218 00:47:18.275690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.275703 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:18.275709 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:18.275766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:18.302626 1201669 cri.go:89] found id: ""
	I1218 00:47:18.302640 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.302656 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:18.302661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:18.302716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:18.328772 1201669 cri.go:89] found id: ""
	I1218 00:47:18.328785 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.328792 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:18.328797 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:18.328852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:18.354242 1201669 cri.go:89] found id: ""
	I1218 00:47:18.354256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.354263 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:18.354268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:18.354332 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:18.378135 1201669 cri.go:89] found id: ""
	I1218 00:47:18.378148 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.378157 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:18.378165 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:18.378175 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:18.443885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:18.443904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:18.458116 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:18.458135 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:18.520486 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.520496 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:18.520507 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:18.586967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:18.586986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:21.118235 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:21.128015 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:21.128072 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:21.153715 1201669 cri.go:89] found id: ""
	I1218 00:47:21.153729 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.153736 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:21.153742 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:21.153803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:21.183062 1201669 cri.go:89] found id: ""
	I1218 00:47:21.183075 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.183082 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:21.183087 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:21.183144 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:21.210382 1201669 cri.go:89] found id: ""
	I1218 00:47:21.210396 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.210402 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:21.210407 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:21.210462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:21.235561 1201669 cri.go:89] found id: ""
	I1218 00:47:21.235575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.235582 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:21.235587 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:21.235684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:21.261486 1201669 cri.go:89] found id: ""
	I1218 00:47:21.261500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.261507 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:21.261512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:21.261571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:21.286687 1201669 cri.go:89] found id: ""
	I1218 00:47:21.286701 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.286708 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:21.286713 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:21.286770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:21.312639 1201669 cri.go:89] found id: ""
	I1218 00:47:21.312656 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.312663 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:21.312671 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:21.312682 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:21.377475 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:21.377494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:21.394148 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:21.394166 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:21.461525 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:21.461535 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:21.461546 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:21.529823 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:21.529841 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.060601 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:24.071009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:24.071080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:24.098379 1201669 cri.go:89] found id: ""
	I1218 00:47:24.098392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.098399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:24.098406 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:24.098520 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:24.125402 1201669 cri.go:89] found id: ""
	I1218 00:47:24.125416 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.125423 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:24.125428 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:24.125487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:24.151397 1201669 cri.go:89] found id: ""
	I1218 00:47:24.151410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.151417 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:24.151422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:24.151485 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:24.178459 1201669 cri.go:89] found id: ""
	I1218 00:47:24.178473 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.178480 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:24.178485 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:24.178542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:24.204162 1201669 cri.go:89] found id: ""
	I1218 00:47:24.204175 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.204182 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:24.204188 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:24.204282 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:24.232955 1201669 cri.go:89] found id: ""
	I1218 00:47:24.232969 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.232977 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:24.232982 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:24.233043 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:24.258828 1201669 cri.go:89] found id: ""
	I1218 00:47:24.258841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.258848 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:24.258856 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:24.258867 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.285593 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:24.285609 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:24.352328 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:24.352348 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:24.367078 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:24.367095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:24.430867 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:24.430877 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:24.430887 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.002647 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:27.013860 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:27.013930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:27.042334 1201669 cri.go:89] found id: ""
	I1218 00:47:27.042347 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.042354 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:27.042360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:27.042419 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:27.066697 1201669 cri.go:89] found id: ""
	I1218 00:47:27.066710 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.066717 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:27.066722 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:27.066777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:27.094998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.095011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.095018 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:27.095024 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:27.095081 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:27.122504 1201669 cri.go:89] found id: ""
	I1218 00:47:27.122518 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.122525 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:27.122530 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:27.122587 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:27.147998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.148011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.148018 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:27.148023 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:27.148093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:27.172132 1201669 cri.go:89] found id: ""
	I1218 00:47:27.172149 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.172156 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:27.172161 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:27.172253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:27.197418 1201669 cri.go:89] found id: ""
	I1218 00:47:27.197431 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.197438 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:27.197445 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:27.197455 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:27.263570 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:27.263588 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:27.278312 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:27.278327 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:27.342448 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:27.342458 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:27.342469 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.410881 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:27.410901 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:29.944358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:29.954644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:29.954701 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:29.978633 1201669 cri.go:89] found id: ""
	I1218 00:47:29.978647 1201669 logs.go:282] 0 containers: []
	W1218 00:47:29.978654 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:29.978659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:29.978717 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:30.009832 1201669 cri.go:89] found id: ""
	I1218 00:47:30.009850 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.009858 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:30.009864 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:30.009938 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:30.040840 1201669 cri.go:89] found id: ""
	I1218 00:47:30.040858 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.040867 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:30.040876 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:30.040952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:30.068318 1201669 cri.go:89] found id: ""
	I1218 00:47:30.068332 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.068339 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:30.068344 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:30.068407 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:30.094562 1201669 cri.go:89] found id: ""
	I1218 00:47:30.094577 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.094584 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:30.094589 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:30.094650 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:30.121388 1201669 cri.go:89] found id: ""
	I1218 00:47:30.121402 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.121409 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:30.121415 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:30.121472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:30.149519 1201669 cri.go:89] found id: ""
	I1218 00:47:30.149533 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.149540 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:30.149550 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:30.149565 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:30.177089 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:30.177107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:30.242748 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:30.242767 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:30.257468 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:30.257483 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:30.320728 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:30.320738 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:30.320749 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:32.889870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:32.900811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:32.900868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:32.927540 1201669 cri.go:89] found id: ""
	I1218 00:47:32.927553 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.927560 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:32.927565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:32.927622 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:32.955598 1201669 cri.go:89] found id: ""
	I1218 00:47:32.955611 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.955619 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:32.955623 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:32.955695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:32.979141 1201669 cri.go:89] found id: ""
	I1218 00:47:32.979155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.979162 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:32.979167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:32.979224 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:33.006203 1201669 cri.go:89] found id: ""
	I1218 00:47:33.006218 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.006225 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:33.006230 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:33.006294 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:33.034661 1201669 cri.go:89] found id: ""
	I1218 00:47:33.034675 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.034691 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:33.034697 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:33.034756 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:33.062772 1201669 cri.go:89] found id: ""
	I1218 00:47:33.062786 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.062793 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:33.062804 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:33.062869 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:33.086825 1201669 cri.go:89] found id: ""
	I1218 00:47:33.086839 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.086846 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:33.086871 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:33.086881 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:33.156565 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:33.156585 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:33.185756 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:33.185772 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:33.256648 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:33.256666 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:33.271243 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:33.271259 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:33.337446 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:35.839102 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:35.850275 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:35.850343 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:35.875276 1201669 cri.go:89] found id: ""
	I1218 00:47:35.875289 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.875296 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:35.875301 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:35.875361 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:35.912387 1201669 cri.go:89] found id: ""
	I1218 00:47:35.912400 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.912407 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:35.912412 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:35.912471 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:35.942361 1201669 cri.go:89] found id: ""
	I1218 00:47:35.942379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.942394 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:35.942400 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:35.942499 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:35.972562 1201669 cri.go:89] found id: ""
	I1218 00:47:35.972575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.972584 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:35.972588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:35.972644 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:35.998846 1201669 cri.go:89] found id: ""
	I1218 00:47:35.998861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.998868 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:35.998874 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:35.998952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:36.030184 1201669 cri.go:89] found id: ""
	I1218 00:47:36.030197 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.030213 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:36.030219 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:36.030292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:36.055609 1201669 cri.go:89] found id: ""
	I1218 00:47:36.055624 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.055640 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:36.055648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:36.055658 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:36.128355 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:36.128374 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:36.159887 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:36.159904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:36.229693 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:36.229712 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:36.244397 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:36.244412 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:36.308352 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:38.808637 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:38.819085 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:38.819152 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:38.844745 1201669 cri.go:89] found id: ""
	I1218 00:47:38.844758 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.844766 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:38.844771 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:38.844827 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:38.869442 1201669 cri.go:89] found id: ""
	I1218 00:47:38.869456 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.869463 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:38.869469 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:38.869531 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:38.899129 1201669 cri.go:89] found id: ""
	I1218 00:47:38.899151 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.899158 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:38.899163 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:38.899232 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:38.929158 1201669 cri.go:89] found id: ""
	I1218 00:47:38.929171 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.929178 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:38.929184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:38.929250 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:38.959987 1201669 cri.go:89] found id: ""
	I1218 00:47:38.960016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.960023 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:38.960029 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:38.960093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:38.987077 1201669 cri.go:89] found id: ""
	I1218 00:47:38.987091 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.987098 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:38.987104 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:38.987160 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:39.015227 1201669 cri.go:89] found id: ""
	I1218 00:47:39.015240 1201669 logs.go:282] 0 containers: []
	W1218 00:47:39.015257 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:39.015266 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:39.015278 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:39.044299 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:39.044322 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:39.110657 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:39.110677 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:39.127155 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:39.127171 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:39.195223 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:39.195233 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:39.195243 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:41.762478 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:41.772539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:41.772600 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:41.797940 1201669 cri.go:89] found id: ""
	I1218 00:47:41.797954 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.797961 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:41.797967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:41.798024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:41.823230 1201669 cri.go:89] found id: ""
	I1218 00:47:41.823244 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.823251 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:41.823256 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:41.823314 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:41.848348 1201669 cri.go:89] found id: ""
	I1218 00:47:41.848368 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.848384 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:41.848390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:41.848447 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:41.873186 1201669 cri.go:89] found id: ""
	I1218 00:47:41.873199 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.873207 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:41.873212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:41.873269 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:41.910240 1201669 cri.go:89] found id: ""
	I1218 00:47:41.910253 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.910260 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:41.910265 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:41.910323 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:41.938636 1201669 cri.go:89] found id: ""
	I1218 00:47:41.938649 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.938656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:41.938661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:41.938723 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:41.967011 1201669 cri.go:89] found id: ""
	I1218 00:47:41.967024 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.967031 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:41.967039 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:41.967048 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:42.032273 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:42.032293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:42.047961 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:42.047977 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:42.129763 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:42.129777 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:42.129788 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:42.203638 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:42.203661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:44.747018 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:44.757561 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:44.757666 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:44.782848 1201669 cri.go:89] found id: ""
	I1218 00:47:44.782861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.782868 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:44.782873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:44.782930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:44.812029 1201669 cri.go:89] found id: ""
	I1218 00:47:44.812042 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.812049 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:44.812054 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:44.812111 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:44.835973 1201669 cri.go:89] found id: ""
	I1218 00:47:44.835986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.835994 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:44.835998 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:44.836055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:44.865506 1201669 cri.go:89] found id: ""
	I1218 00:47:44.865524 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.865532 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:44.865539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:44.865596 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:44.895590 1201669 cri.go:89] found id: ""
	I1218 00:47:44.895603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.895610 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:44.895615 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:44.895678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:44.930517 1201669 cri.go:89] found id: ""
	I1218 00:47:44.930531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.930538 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:44.930544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:44.930602 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:44.963147 1201669 cri.go:89] found id: ""
	I1218 00:47:44.963161 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.963168 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:44.963176 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:44.963187 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:45.068693 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:45.068706 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:45.068718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:45.150525 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:45.150547 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:45.198775 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:45.198795 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:45.282633 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:45.282655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:47.798966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:47.809011 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:47.809070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:47.836141 1201669 cri.go:89] found id: ""
	I1218 00:47:47.836155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.836161 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:47.836167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:47.836256 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:47.862554 1201669 cri.go:89] found id: ""
	I1218 00:47:47.862568 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.862575 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:47.862580 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:47.862645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:47.889972 1201669 cri.go:89] found id: ""
	I1218 00:47:47.889986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.889992 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:47.889997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:47.890054 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:47.922142 1201669 cri.go:89] found id: ""
	I1218 00:47:47.922155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.922162 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:47.922168 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:47.922223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:47.956979 1201669 cri.go:89] found id: ""
	I1218 00:47:47.956993 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.956999 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:47.957005 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:47.957062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:47.982938 1201669 cri.go:89] found id: ""
	I1218 00:47:47.982952 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.982959 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:47.982965 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:47.983027 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:48.014164 1201669 cri.go:89] found id: ""
	I1218 00:47:48.014178 1201669 logs.go:282] 0 containers: []
	W1218 00:47:48.014184 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:48.014192 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:48.014205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:48.078819 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:48.078831 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:48.078850 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:48.151018 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:48.151045 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:48.178919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:48.178937 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:48.246806 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:48.246828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:50.762650 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:50.772894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:50.772953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:50.798440 1201669 cri.go:89] found id: ""
	I1218 00:47:50.798453 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.798459 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:50.798468 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:50.798525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:50.824627 1201669 cri.go:89] found id: ""
	I1218 00:47:50.824641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.824648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:50.824654 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:50.824713 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:50.849720 1201669 cri.go:89] found id: ""
	I1218 00:47:50.849732 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.849740 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:50.849745 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:50.849802 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:50.873828 1201669 cri.go:89] found id: ""
	I1218 00:47:50.873841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.873849 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:50.873854 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:50.873910 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:50.905379 1201669 cri.go:89] found id: ""
	I1218 00:47:50.905392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.905399 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:50.905404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:50.905461 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:50.935677 1201669 cri.go:89] found id: ""
	I1218 00:47:50.935690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.935697 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:50.935702 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:50.935774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:50.970057 1201669 cri.go:89] found id: ""
	I1218 00:47:50.970070 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.970077 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:50.970085 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:50.970095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:51.036789 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:51.036810 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:51.051895 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:51.051913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:51.116641 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:51.116651 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:51.116663 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:51.186315 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:51.186337 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:53.718450 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:53.728262 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:53.728318 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:53.755772 1201669 cri.go:89] found id: ""
	I1218 00:47:53.755787 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.755793 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:53.755798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:53.755855 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:53.780839 1201669 cri.go:89] found id: ""
	I1218 00:47:53.780853 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.780860 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:53.780865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:53.780929 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:53.806552 1201669 cri.go:89] found id: ""
	I1218 00:47:53.806603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.806611 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:53.806616 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:53.806672 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:53.832361 1201669 cri.go:89] found id: ""
	I1218 00:47:53.832380 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.832401 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:53.832420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:53.832492 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:53.859241 1201669 cri.go:89] found id: ""
	I1218 00:47:53.859254 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.859262 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:53.859277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:53.859335 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:53.884714 1201669 cri.go:89] found id: ""
	I1218 00:47:53.884728 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.884735 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:53.884740 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:53.884803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:53.921003 1201669 cri.go:89] found id: ""
	I1218 00:47:53.921016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.921024 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:53.921031 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:53.921041 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:54.003954 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:54.003975 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:54.020878 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:54.020896 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:54.086911 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:54.086921 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:54.086943 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:54.157859 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:54.157878 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:56.687608 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:56.697675 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:56.697732 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:56.722032 1201669 cri.go:89] found id: ""
	I1218 00:47:56.722045 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.722053 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:56.722058 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:56.722113 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:56.746685 1201669 cri.go:89] found id: ""
	I1218 00:47:56.746698 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.746705 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:56.746712 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:56.746769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:56.771487 1201669 cri.go:89] found id: ""
	I1218 00:47:56.771500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.771508 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:56.771515 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:56.771571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:56.795765 1201669 cri.go:89] found id: ""
	I1218 00:47:56.795778 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.795785 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:56.795790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:56.795845 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:56.820457 1201669 cri.go:89] found id: ""
	I1218 00:47:56.820470 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.820477 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:56.820482 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:56.820543 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:56.844750 1201669 cri.go:89] found id: ""
	I1218 00:47:56.844764 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.844788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:56.844794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:56.844859 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:56.870299 1201669 cri.go:89] found id: ""
	I1218 00:47:56.870312 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.870319 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:56.870326 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:56.870336 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:56.957977 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:56.957986 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:56.957996 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:57.026903 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:57.026922 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:57.056057 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:57.056072 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:57.122322 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:57.122341 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.637384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:59.647089 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:59.647147 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:59.675785 1201669 cri.go:89] found id: ""
	I1218 00:47:59.675798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.675805 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:59.675811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:59.675868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:59.700863 1201669 cri.go:89] found id: ""
	I1218 00:47:59.700876 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.700883 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:59.700888 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:59.700951 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:59.726366 1201669 cri.go:89] found id: ""
	I1218 00:47:59.726379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.726388 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:59.726394 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:59.726449 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:59.754806 1201669 cri.go:89] found id: ""
	I1218 00:47:59.754819 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.754826 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:59.754832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:59.754887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:59.779823 1201669 cri.go:89] found id: ""
	I1218 00:47:59.779842 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.779850 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:59.779855 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:59.779931 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:59.809497 1201669 cri.go:89] found id: ""
	I1218 00:47:59.809511 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.809519 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:59.809524 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:59.809580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:59.834274 1201669 cri.go:89] found id: ""
	I1218 00:47:59.834287 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.834294 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:59.834302 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:59.834312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:59.908086 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:59.908107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.923555 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:59.923571 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:59.996659 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:59.996668 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:59.996679 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:48:00.245332 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:48:00.245355 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:48:02.854946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:48:02.865088 1201669 kubeadm.go:602] duration metric: took 4m2.280648529s to restartPrimaryControlPlane
	W1218 00:48:02.865154 1201669 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1218 00:48:02.865291 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:48:03.285302 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:48:03.298386 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:48:03.307630 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:48:03.307686 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:48:03.316384 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:48:03.316392 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:48:03.316448 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:48:03.324266 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:48:03.324330 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:48:03.332001 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:48:03.339756 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:48:03.339811 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:48:03.347895 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.356395 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:48:03.356451 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.364239 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:48:03.373496 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:48:03.373555 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:48:03.380932 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:48:03.422222 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:48:03.422277 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:48:03.498554 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:48:03.498619 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:48:03.498653 1201669 kubeadm.go:319] OS: Linux
	I1218 00:48:03.498697 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:48:03.498750 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:48:03.498797 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:48:03.498844 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:48:03.498890 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:48:03.498939 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:48:03.498983 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:48:03.499030 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:48:03.499077 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:48:03.575694 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:48:03.575807 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:48:03.575895 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:48:03.584731 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:48:03.590040 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:48:03.590125 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:48:03.590198 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:48:03.590273 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:48:03.590332 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:48:03.590401 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:48:03.590455 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:48:03.590517 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:48:03.590577 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:48:03.590649 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:48:03.590726 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:48:03.590762 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:48:03.590820 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:48:03.968959 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:48:04.492311 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:48:04.657077 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:48:05.347391 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:48:06.111689 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:48:06.112246 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:48:06.114858 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:48:06.118151 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:48:06.118267 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:48:06.118369 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:48:06.118440 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:48:06.133862 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:48:06.134164 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:48:06.143224 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:48:06.143316 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:48:06.143354 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:48:06.274772 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:48:06.274905 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:52:06.274474 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000113635s
	I1218 00:52:06.274499 1201669 kubeadm.go:319] 
	I1218 00:52:06.274555 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:52:06.274586 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:52:06.274697 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:52:06.274703 1201669 kubeadm.go:319] 
	I1218 00:52:06.274816 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:52:06.274846 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:52:06.274874 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:52:06.274877 1201669 kubeadm.go:319] 
	I1218 00:52:06.279422 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:52:06.279849 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:52:06.279958 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:52:06.280242 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1218 00:52:06.280248 1201669 kubeadm.go:319] 
	I1218 00:52:06.280323 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1218 00:52:06.280425 1201669 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000113635s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 00:52:06.280513 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:52:06.687216 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:52:06.699735 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:52:06.699788 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:52:06.707587 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:52:06.707598 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:52:06.707647 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:52:06.715175 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:52:06.715229 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:52:06.722487 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:52:06.729668 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:52:06.729722 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:52:06.736814 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.744131 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:52:06.744183 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.751469 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:52:06.758728 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:52:06.758782 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:52:06.765652 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:52:06.801363 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:52:06.801639 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:52:06.871618 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:52:06.871677 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:52:06.871709 1201669 kubeadm.go:319] OS: Linux
	I1218 00:52:06.871750 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:52:06.871795 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:52:06.871839 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:52:06.871883 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:52:06.871926 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:52:06.871970 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:52:06.872012 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:52:06.872056 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:52:06.872097 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:52:06.943596 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:52:06.943710 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:52:06.943809 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:52:06.952719 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:52:06.957986 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:52:06.958071 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:52:06.958134 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:52:06.958209 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:52:06.958270 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:52:06.958342 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:52:06.958395 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:52:06.958469 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:52:06.958529 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:52:06.958603 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:52:06.958674 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:52:06.958710 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:52:06.958765 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:52:07.159266 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:52:07.543682 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:52:07.621245 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:52:07.789755 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:52:08.258810 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:52:08.259464 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:52:08.262206 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:52:08.265520 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:52:08.265615 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:52:08.265696 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:52:08.266218 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:52:08.282138 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:52:08.282258 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:52:08.290066 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:52:08.290407 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:52:08.290607 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:52:08.422232 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:52:08.422344 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:56:08.423339 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001129518s
	I1218 00:56:08.423364 1201669 kubeadm.go:319] 
	I1218 00:56:08.423420 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:56:08.423452 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:56:08.423565 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:56:08.423570 1201669 kubeadm.go:319] 
	I1218 00:56:08.423755 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:56:08.423825 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:56:08.423872 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:56:08.423876 1201669 kubeadm.go:319] 
	I1218 00:56:08.428596 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:56:08.429049 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:56:08.429151 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:56:08.429380 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 00:56:08.429383 1201669 kubeadm.go:319] 
	I1218 00:56:08.429447 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:56:08.429502 1201669 kubeadm.go:403] duration metric: took 12m7.881074518s to StartCluster
	I1218 00:56:08.429533 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:56:08.429592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:56:08.454446 1201669 cri.go:89] found id: ""
	I1218 00:56:08.454459 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.454467 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:56:08.454472 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:56:08.454527 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:56:08.479309 1201669 cri.go:89] found id: ""
	I1218 00:56:08.479323 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.479330 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:56:08.479335 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:56:08.479395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:56:08.506727 1201669 cri.go:89] found id: ""
	I1218 00:56:08.506740 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.506747 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:56:08.506752 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:56:08.506809 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:56:08.531214 1201669 cri.go:89] found id: ""
	I1218 00:56:08.531228 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.531235 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:56:08.531240 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:56:08.531295 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:56:08.555634 1201669 cri.go:89] found id: ""
	I1218 00:56:08.555647 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.555654 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:56:08.555659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:56:08.555716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:56:08.580409 1201669 cri.go:89] found id: ""
	I1218 00:56:08.580423 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.580430 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:56:08.580435 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:56:08.580494 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:56:08.605063 1201669 cri.go:89] found id: ""
	I1218 00:56:08.605089 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.605096 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:56:08.605105 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:56:08.605116 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:56:08.684346 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:56:08.684356 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:56:08.684367 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:56:08.760495 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:56:08.760515 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:56:08.787919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:56:08.787936 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:56:08.853642 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:56:08.853661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	W1218 00:56:08.868901 1201669 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 00:56:08.868939 1201669 out.go:285] * 
	W1218 00:56:08.868999 1201669 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.869015 1201669 out.go:285] * 
	W1218 00:56:08.871456 1201669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:56:08.877860 1201669 out.go:203] 
	W1218 00:56:08.880779 1201669 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.880832 1201669 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 00:56:08.880854 1201669 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 00:56:08.883989 1201669 out.go:203] 
	
	
	==> CRI-O <==
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113118431Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113153129Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113189559Z" level=info msg="Create NRI interface"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113282086Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113290964Z" level=info msg="runtime interface created"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113301647Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113309343Z" level=info msg="runtime interface starting up..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113315505Z" level=info msg="starting plugins..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113327796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.11339067Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:43:59 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.578897723Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=a394bef7-706e-4c2b-a83c-e7a192425f8f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.579569606Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=0b73d3f0-8cf4-4881-9be6-303c65310a78 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58003914Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=ba617c6c-560d-48a4-8069-49b5cad617df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58069138Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=1b435c90-bcae-4d5e-85b5-8f24b84aad77 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581151364Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=1ca4dc15-0b08-49d0-89ca-728ba68fd7be name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581562446Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=9758cff4-6113-4178-8c9f-4ef34a0e91ee name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581979017Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=34a384cc-3abb-4525-b194-0557e1231baf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:12.263044   21382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:12.263577   21382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:12.264700   21382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:12.265247   21382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:12.266798   21382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:56:12 up  7:38,  0 user,  load average: 0.03, 0.16, 0.38
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:56:09 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2130.
	Dec 18 00:56:10 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21249]: E1218 00:56:10.188585   21249 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2131.
	Dec 18 00:56:10 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:10 functional-288604 kubelet[21274]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21274]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:10 functional-288604 kubelet[21274]: E1218 00:56:10.952297   21274 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:10 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:56:11 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2132.
	Dec 18 00:56:11 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:11 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:56:11 functional-288604 kubelet[21299]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:11 functional-288604 kubelet[21299]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:56:11 functional-288604 kubelet[21299]: E1218 00:56:11.699507   21299 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:56:11 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:56:11 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
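The kubelet crash loop above ("kubelet is configured to not run on a host using cgroup v1") means the host mounts the legacy cgroup hierarchy, which v1.35.0-rc.1 refuses. A quick host-side check, sketched here as a generic Linux heuristic and not part of the minikube test suite, is to look for `/sys/fs/cgroup/cgroup.controllers`, a file that only exists on a cgroup v2 (unified) mount:

```go
package main

import (
	"fmt"
	"os"
)

// cgroupVersion reports which cgroup hierarchy the host appears to mount.
// The presence of cgroup.controllers at the cgroup mount root is the
// conventional marker for cgroup v2; this is a heuristic, not minikube code.
func cgroupVersion() string {
	if _, err := os.Stat("/sys/fs/cgroup/cgroup.controllers"); err == nil {
		return "v2"
	}
	if _, err := os.Stat("/sys/fs/cgroup"); err == nil {
		return "v1"
	}
	return "unknown" // non-Linux host or cgroupfs not mounted
}

func main() {
	fmt.Println(cgroupVersion())
}
```

On the Ubuntu 20.04 / 5.15 AWS host in this run the check would report "v1", matching the validation failure in the kubelet log.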
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (377.753621ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-288604 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-288604 apply -f testdata/invalidsvc.yaml: exit status 1 (59.455279ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-288604 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.73s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-288604 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-288604 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-288604 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-288604 --alsologtostderr -v=1] stderr:
I1218 00:58:39.813758 1219187 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:39.813910 1219187 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:39.813928 1219187 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:39.813945 1219187 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:39.814216 1219187 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:39.814490 1219187 mustload.go:66] Loading cluster: functional-288604
I1218 00:58:39.814924 1219187 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:39.815410 1219187 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:39.830986 1219187 host.go:66] Checking if "functional-288604" exists ...
I1218 00:58:39.831316 1219187 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1218 00:58:39.884662 1219187 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.87556633 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1218 00:58:39.884785 1219187 api_server.go:166] Checking apiserver status ...
I1218 00:58:39.884859 1219187 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1218 00:58:39.884900 1219187 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:39.901836 1219187 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
W1218 00:58:40.035367 1219187 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1218 00:58:40.043264 1219187 out.go:179] * The control-plane node functional-288604 apiserver is not running: (state=Stopped)
I1218 00:58:40.046438 1219187 out.go:179]   To start a cluster, run: "minikube start -p functional-288604"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
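The "22/tcp" → "33925" binding in the inspect output above is what minikube reads with the Go template seen earlier in the log, docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'". That same template can be exercised against a stand-in structure; the types below are illustrative and are not Docker's real API types:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// portBinding mirrors the shape of one entry under NetworkSettings.Ports
// in the docker inspect output above (a stand-in, not Docker's type).
type portBinding struct {
	HostIp   string
	HostPort string
}

// hostPort evaluates the same Go template minikube passes to
// `docker container inspect -f` to pull a published host port.
func hostPort(data any, containerPort string) (string, error) {
	tmpl, err := template.New("port").Parse(
		fmt.Sprintf(`{{(index (index .NetworkSettings.Ports %q) 0).HostPort}}`, containerPort))
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, data); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	// Minimal slice of the inspect JSON shown above.
	inspect := map[string]any{
		"NetworkSettings": map[string]any{
			"Ports": map[string][]portBinding{
				"22/tcp":   {{HostIp: "127.0.0.1", HostPort: "33925"}},
				"8441/tcp": {{HostIp: "127.0.0.1", HostPort: "33928"}},
			},
		},
	}
	p, err := hostPort(inspect, "22/tcp")
	if err != nil {
		panic(err)
	}
	fmt.Println(p) // the SSH host port from the inspect output above
}
```

The nested `index` calls are the template-language equivalent of `ports["22/tcp"][0]`; a missing key or empty slice makes `Execute` return an error rather than a partial result.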
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (326.75373ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-288604 service hello-node --url                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh -- ls -la /mount-9p                                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh cat /mount-9p/test-1766019510054259136                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh sudo umount -f /mount-9p                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3660484500/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh -- ls -la /mount-9p                                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh sudo umount -f /mount-9p                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount1 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount2 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount3 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount1                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount1                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh findmnt -T /mount2                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh findmnt -T /mount3                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ mount     │ -p functional-288604 --kill=true                                                                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-288604 --alsologtostderr -v=1                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:58:39
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:58:39.583243 1219115 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:58:39.583636 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.583653 1219115 out.go:374] Setting ErrFile to fd 2...
	I1218 00:58:39.583659 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.584016 1219115 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:58:39.584840 1219115 out.go:368] Setting JSON to false
	I1218 00:58:39.585815 1219115 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":27668,"bootTime":1765991852,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:58:39.585921 1219115 start.go:143] virtualization:  
	I1218 00:58:39.589258 1219115 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:58:39.593010 1219115 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:58:39.593078 1219115 notify.go:221] Checking for updates...
	I1218 00:58:39.598753 1219115 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:58:39.601710 1219115 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:58:39.604519 1219115 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:58:39.607336 1219115 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:58:39.610130 1219115 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:58:39.613458 1219115 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:58:39.614052 1219115 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:58:39.641766 1219115 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:58:39.641879 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.697740 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.688189308 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.697856 1219115 docker.go:319] overlay module found
	I1218 00:58:39.701039 1219115 out.go:179] * Using the docker driver based on existing profile
	I1218 00:58:39.703916 1219115 start.go:309] selected driver: docker
	I1218 00:58:39.703938 1219115 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APISer
verName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker Bin
aryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.704044 1219115 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:58:39.704152 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.758556 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.749355176 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.758951 1219115 cni.go:84] Creating CNI manager for ""
	I1218 00:58:39.759012 1219115 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:58:39.759062 1219115 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Con
tainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.762264 1219115 out.go:179] * dry-run validation complete!
	
	
	==> CRI-O <==
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113118431Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113153129Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113189559Z" level=info msg="Create NRI interface"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113282086Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113290964Z" level=info msg="runtime interface created"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113301647Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113309343Z" level=info msg="runtime interface starting up..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113315505Z" level=info msg="starting plugins..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113327796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.11339067Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:43:59 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.578897723Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=a394bef7-706e-4c2b-a83c-e7a192425f8f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.579569606Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=0b73d3f0-8cf4-4881-9be6-303c65310a78 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58003914Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=ba617c6c-560d-48a4-8069-49b5cad617df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58069138Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=1b435c90-bcae-4d5e-85b5-8f24b84aad77 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581151364Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=1ca4dc15-0b08-49d0-89ca-728ba68fd7be name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581562446Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=9758cff4-6113-4178-8c9f-4ef34a0e91ee name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581979017Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=34a384cc-3abb-4525-b194-0557e1231baf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:58:41.113537   23615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:41.114341   23615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:41.116039   23615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:41.116543   23615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:41.118085   23615 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:58:41 up  7:41,  0 user,  load average: 1.47, 0.49, 0.45
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:58:38 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:39 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2329.
	Dec 18 00:58:39 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:39 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:39 functional-288604 kubelet[23497]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:39 functional-288604 kubelet[23497]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:39 functional-288604 kubelet[23497]: E1218 00:58:39.449011   23497 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:39 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:39 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2330.
	Dec 18 00:58:40 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:40 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:40 functional-288604 kubelet[23511]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:40 functional-288604 kubelet[23511]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:40 functional-288604 kubelet[23511]: E1218 00:58:40.209142   23511 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2331.
	Dec 18 00:58:40 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:40 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:40 functional-288604 kubelet[23574]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:40 functional-288604 kubelet[23574]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:40 functional-288604 kubelet[23574]: E1218 00:58:40.948297   23574 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:40 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (298.752114ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.73s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.09s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 status: exit status 2 (335.176221ms)

-- stdout --
	functional-288604
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-288604 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (305.202766ms)

-- stdout --
	host:Running,kublet:Stopped,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-288604 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 status -o json: exit status 2 (324.663273ms)

-- stdout --
	{"Name":"functional-288604","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-288604 status -o json" : exit status 2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (332.746302ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service │ functional-288604 service list                                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ service │ functional-288604 service list -o json                                                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ service │ functional-288604 service --namespace=default --https --url hello-node                                                                              │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ service │ functional-288604 service hello-node --url --format={{.IP}}                                                                                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ service │ functional-288604 service hello-node --url                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount   │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh -- ls -la /mount-9p                                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh cat /mount-9p/test-1766019510054259136                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh sudo umount -f /mount-9p                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ mount   │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3660484500/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh -- ls -la /mount-9p                                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh sudo umount -f /mount-9p                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount   │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount1 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount   │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount2 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ mount   │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount3 --alsologtostderr -v=1                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount1                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh     │ functional-288604 ssh findmnt -T /mount1                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh findmnt -T /mount2                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh     │ functional-288604 ssh findmnt -T /mount3                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ mount   │ -p functional-288604 --kill=true                                                                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:43:55
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:43:55.978742 1201669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:43:55.978849 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.978853 1201669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:43:55.978857 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.979124 1201669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:43:55.979466 1201669 out.go:368] Setting JSON to false
	I1218 00:43:55.980315 1201669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26784,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:43:55.980372 1201669 start.go:143] virtualization:  
	I1218 00:43:55.983789 1201669 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:43:55.987542 1201669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:43:55.987604 1201669 notify.go:221] Checking for updates...
	I1218 00:43:55.993164 1201669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:43:55.995954 1201669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:43:55.999614 1201669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:43:56.002831 1201669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:43:56.005802 1201669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:43:56.009212 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:56.009315 1201669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:43:56.041210 1201669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:43:56.041338 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.105588 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.095254501 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.105683 1201669 docker.go:319] overlay module found
	I1218 00:43:56.108792 1201669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:43:56.111628 1201669 start.go:309] selected driver: docker
	I1218 00:43:56.111638 1201669 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.111765 1201669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:43:56.111873 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.170180 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.160520969 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.170597 1201669 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:43:56.170621 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:56.170672 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:56.170715 1201669 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.173990 1201669 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:43:56.177055 1201669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:43:56.179992 1201669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:43:56.182847 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:56.182889 1201669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:43:56.182897 1201669 cache.go:65] Caching tarball of preloaded images
	I1218 00:43:56.182969 1201669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:43:56.182979 1201669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:43:56.182988 1201669 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:43:56.183103 1201669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:43:56.202673 1201669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:43:56.202684 1201669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:43:56.202702 1201669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:43:56.202743 1201669 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:43:56.202797 1201669 start.go:364] duration metric: took 37.488µs to acquireMachinesLock for "functional-288604"
	I1218 00:43:56.202818 1201669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:43:56.202823 1201669 fix.go:54] fixHost starting: 
	I1218 00:43:56.203129 1201669 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:43:56.220546 1201669 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:43:56.220565 1201669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:43:56.223742 1201669 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:43:56.223770 1201669 machine.go:94] provisionDockerMachine start ...
	I1218 00:43:56.223861 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.243517 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.243858 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.243865 1201669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:43:56.399607 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.399622 1201669 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:43:56.399683 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.417287 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.417598 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.417605 1201669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:43:56.583098 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.583184 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.603369 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.603669 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.603683 1201669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:43:56.772929 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:43:56.772944 1201669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:43:56.772975 1201669 ubuntu.go:190] setting up certificates
	I1218 00:43:56.772989 1201669 provision.go:84] configureAuth start
	I1218 00:43:56.773070 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:56.789980 1201669 provision.go:143] copyHostCerts
	I1218 00:43:56.790044 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:43:56.790056 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:43:56.790131 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:43:56.790231 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:43:56.790235 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:43:56.790260 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:43:56.790310 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:43:56.790313 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:43:56.790335 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:43:56.790376 1201669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:43:56.986120 1201669 provision.go:177] copyRemoteCerts
	I1218 00:43:56.986182 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:43:56.986224 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.010906 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.115839 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:43:57.132835 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:43:57.150663 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:43:57.167535 1201669 provision.go:87] duration metric: took 394.523589ms to configureAuth
	I1218 00:43:57.167552 1201669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:43:57.167745 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:57.167846 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.184649 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:57.184955 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:57.184966 1201669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:43:57.547661 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:43:57.547677 1201669 machine.go:97] duration metric: took 1.323900056s to provisionDockerMachine
	I1218 00:43:57.547689 1201669 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:43:57.547701 1201669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:43:57.547767 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:43:57.547816 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.568532 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.675839 1201669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:43:57.679095 1201669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:43:57.679112 1201669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:43:57.679121 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:43:57.679176 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:43:57.679251 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:43:57.679324 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:43:57.679367 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:43:57.686719 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:57.703522 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:43:57.720871 1201669 start.go:296] duration metric: took 173.166293ms for postStartSetup
	I1218 00:43:57.720943 1201669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:43:57.720983 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.737854 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.841489 1201669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:43:57.846519 1201669 fix.go:56] duration metric: took 1.643688341s for fixHost
	I1218 00:43:57.846534 1201669 start.go:83] releasing machines lock for "functional-288604", held for 1.6437309s
	I1218 00:43:57.846614 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:57.862813 1201669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:43:57.862836 1201669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:43:57.862859 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.862906 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.880942 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.881296 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.984097 1201669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:43:58.077458 1201669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:43:58.117786 1201669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:43:58.128203 1201669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:43:58.128283 1201669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:43:58.137853 1201669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:43:58.137867 1201669 start.go:496] detecting cgroup driver to use...
	I1218 00:43:58.137898 1201669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:43:58.137955 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:43:58.154333 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:43:58.171243 1201669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:43:58.171317 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:43:58.187629 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:43:58.200443 1201669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:43:58.332309 1201669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:43:58.456320 1201669 docker.go:234] disabling docker service ...
	I1218 00:43:58.456386 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:43:58.471261 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:43:58.484090 1201669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:43:58.600872 1201669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:43:58.712059 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:43:58.725312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:43:58.738398 1201669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:43:58.738467 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.746850 1201669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:43:58.746917 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.755273 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.763400 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.771727 1201669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:43:58.779324 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.788210 1201669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.796348 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.804389 1201669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:43:58.811403 1201669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:43:58.818408 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:58.951912 1201669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:43:59.118783 1201669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:43:59.118849 1201669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:43:59.122545 1201669 start.go:564] Will wait 60s for crictl version
	I1218 00:43:59.122604 1201669 ssh_runner.go:195] Run: which crictl
	I1218 00:43:59.126019 1201669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:43:59.148982 1201669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:43:59.149067 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.175940 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.206912 1201669 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:43:59.209698 1201669 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:43:59.225649 1201669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:43:59.232549 1201669 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1218 00:43:59.235431 1201669 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:43:59.235543 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:59.235614 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.273407 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.273418 1201669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:43:59.273471 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.299275 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.299287 1201669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:43:59.299293 1201669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:43:59.299404 1201669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:43:59.299490 1201669 ssh_runner.go:195] Run: crio config
	I1218 00:43:59.362084 1201669 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1218 00:43:59.362106 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:59.362113 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:59.362126 1201669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:43:59.362149 1201669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:43:59.362277 1201669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:43:59.362352 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:43:59.369805 1201669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:43:59.369864 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:43:59.376968 1201669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:43:59.388765 1201669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:43:59.400454 1201669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2069 bytes)
	I1218 00:43:59.412514 1201669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:43:59.416040 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:59.531606 1201669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:43:59.640794 1201669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:43:59.640805 1201669 certs.go:195] generating shared ca certs ...
	I1218 00:43:59.640830 1201669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:43:59.640959 1201669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:43:59.641001 1201669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:43:59.641007 1201669 certs.go:257] generating profile certs ...
	I1218 00:43:59.641121 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:43:59.641164 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:43:59.641201 1201669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:43:59.641309 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:43:59.641337 1201669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:43:59.641343 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:43:59.641373 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:43:59.641395 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:43:59.641423 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:43:59.641463 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:59.642073 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:43:59.660992 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:43:59.679818 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:43:59.699150 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:43:59.718895 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:43:59.738413 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:43:59.756315 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:43:59.773826 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:43:59.791059 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:43:59.807447 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:43:59.824212 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:43:59.841186 1201669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:43:59.853492 1201669 ssh_runner.go:195] Run: openssl version
	I1218 00:43:59.859998 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.866869 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:43:59.873885 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877278 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877331 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.917714 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:43:59.925047 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.932048 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:43:59.939101 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942813 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942866 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.983421 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:43:59.990593 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:43:59.997725 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:44:00.042943 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059312 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059393 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.179416 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:44:00.199517 1201669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:44:00.211411 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:44:00.299862 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:44:00.347783 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:44:00.400161 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:44:00.445236 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:44:00.505288 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:44:00.548440 1201669 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:44:00.548538 1201669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:44:00.548659 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.576531 1201669 cri.go:89] found id: ""
	I1218 00:44:00.576602 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:44:00.584414 1201669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:44:00.584430 1201669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:44:00.584481 1201669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:44:00.591678 1201669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.592197 1201669 kubeconfig.go:125] found "functional-288604" server: "https://192.168.49.2:8441"
	I1218 00:44:00.593407 1201669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:44:00.601066 1201669 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-18 00:29:23.211763247 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-18 00:43:59.405160305 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
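The drift check above reduces to diffing the deployed kubeadm.yaml against the freshly rendered one and treating any non-zero `diff` exit status as "reconfigure needed". A minimal sketch of that pattern (the helper name and file contents are hypothetical, and `sudo`/real paths are omitted):

```shell
#!/usr/bin/env bash
# Sketch of the config-drift check seen in the log: diff the deployed
# config against the newly rendered one; reconfigure only if they differ.
config_drifted() {
  local deployed=$1 rendered=$2
  # diff exits 0 when the files are identical, 1 when they differ
  ! diff -u "$deployed" "$rendered" > /dev/null
}

deployed=$(mktemp) rendered=$(mktemp)
echo 'value: "NamespaceLifecycle"'     > "$deployed"
echo 'value: "NamespaceAutoProvision"' > "$rendered"

if config_drifted "$deployed" "$rendered"; then
  echo "config drift detected; reconfiguring from $rendered"
fi
rm -f "$deployed" "$rendered"
```

The key design point is that `diff`'s exit status, not its output, drives the decision; the unified diff is only printed to the log for the operator.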
	I1218 00:44:00.601075 1201669 kubeadm.go:1161] stopping kube-system containers ...
	I1218 00:44:00.601085 1201669 cri.go:54] listing CRI containers in root : {State:all Name: Namespaces:[kube-system]}
	I1218 00:44:00.601140 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.626991 1201669 cri.go:89] found id: ""
	I1218 00:44:00.627065 1201669 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 00:44:00.640495 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:44:00.648256 1201669 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 18 00:33 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec 18 00:33 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 18 00:33 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 18 00:33 /etc/kubernetes/scheduler.conf
	
	I1218 00:44:00.648311 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:44:00.655772 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:44:00.663347 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.663410 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:44:00.670748 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.677977 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.678031 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.685079 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:44:00.692996 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.693049 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
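The grep/rm sequence above follows one pattern per conf file: keep it only if it already references the expected apiserver endpoint, otherwise delete it so the subsequent `kubeadm init phase kubeconfig` regenerates it. A sketch of that pattern (helper name and paths are hypothetical, no `sudo`):

```shell
#!/usr/bin/env bash
# Sketch of the stale-kubeconfig cleanup from the log: a conf file that does
# not mention the expected endpoint is removed so kubeadm can recreate it.
prune_stale_conf() {
  local expected=$1 conf=$2
  if ! grep -q "$expected" "$conf"; then
    rm -f "$conf"    # stale: points at a different apiserver endpoint
  fi
}

dir=$(mktemp -d)
echo 'server: https://192.168.49.2:8441' > "$dir/admin.conf"
echo 'server: https://10.0.0.1:6443'     > "$dir/kubelet.conf"

for f in "$dir"/*.conf; do
  prune_stale_conf "https://192.168.49.2:8441" "$f"
done
ls "$dir"    # only admin.conf survives
rm -rf "$dir"
```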
	I1218 00:44:00.700106 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:44:00.707647 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:00.751682 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:01.971643 1201669 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.219916809s)
	I1218 00:44:01.971736 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.213563 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.279593 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.331094 1201669 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:44:02.331177 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... sudo pgrep -xnf kube-apiserver.*minikube.* repeated every ~500ms with no match, 00:44:02 through 00:45:01 (118 polls elided) ...]
	I1218 00:45:01.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
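The `pgrep` run above is a poll-until-deadline loop: check a condition roughly every 500ms until it succeeds or a time budget is exhausted. A self-contained sketch of that loop (the helper name is hypothetical; a file's existence stands in for the real `pgrep -xnf kube-apiserver...` check):

```shell
#!/usr/bin/env bash
# Sketch of the wait-for-apiserver polling seen in the log.
wait_for() {
  local timeout_s=$1; shift
  local deadline=$(( $(date +%s) + timeout_s ))
  while ! "$@"; do
    if [ "$(date +%s)" -ge "$deadline" ]; then
      return 1           # give up, roughly like the one-minute window above
    fi
    sleep 0.5            # matches the ~500ms cadence in the log
  done
}

marker=$(mktemp -u)                 # a path that does not exist yet
( sleep 1; touch "$marker" ) &      # condition becomes true after ~1s
wait_for 5 test -e "$marker" && echo "condition met"
rm -f "$marker"
```

Note that in the log the loop does not block forever: once the window expires, minikube falls back to gathering diagnostics (kubelet, dmesg, CRI-O, container status) before polling again.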
	I1218 00:45:02.332289 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:02.332395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:02.358402 1201669 cri.go:89] found id: ""
	I1218 00:45:02.358416 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.358424 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:02.358429 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:02.358493 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:02.386799 1201669 cri.go:89] found id: ""
	I1218 00:45:02.386814 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.386821 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:02.386825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:02.386882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:02.419430 1201669 cri.go:89] found id: ""
	I1218 00:45:02.419445 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.419453 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:02.419460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:02.419560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:02.445313 1201669 cri.go:89] found id: ""
	I1218 00:45:02.445326 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.445333 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:02.445338 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:02.445395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:02.474189 1201669 cri.go:89] found id: ""
	I1218 00:45:02.474203 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.474210 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:02.474215 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:02.474278 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:02.501782 1201669 cri.go:89] found id: ""
	I1218 00:45:02.501796 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.501803 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:02.501808 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:02.501867 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:02.531648 1201669 cri.go:89] found id: ""
	I1218 00:45:02.531662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.531669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:02.531677 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:02.531690 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:02.597077 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:02.597095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:02.612827 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:02.612845 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:02.680833 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:02.680844 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:02.680855 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:02.749861 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:02.749884 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
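The "container status" command above uses two layers of fallback: `$(which crictl || echo crictl)` resolves the tool's full path when possible but still tries the bare name otherwise, and `|| sudo docker ps -a` chains a second tool if the first fails entirely. A sketch of that pattern (the helper and tool names are hypothetical stand-ins for crictl/docker, and `sudo` is omitted):

```shell
#!/usr/bin/env bash
# Sketch of the two-level fallback in the log's container-status command:
# run the primary tool by full path if found (bare name otherwise), and
# fall back to a second tool when the primary fails or is absent.
run_with_fallback() {
  local primary=$1 fallback=$2
  "$(which "$primary" || echo "$primary")" ps -a 2>/dev/null \
    || "$fallback" ps -a
}
```

This keeps the command usable both on hosts where the tool is on `PATH` and on hosts where only the alternative runtime CLI exists.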
	I1218 00:45:05.287966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:05.298109 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:05.298171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:05.323714 1201669 cri.go:89] found id: ""
	I1218 00:45:05.323727 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.323733 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:05.323739 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:05.323800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:05.348520 1201669 cri.go:89] found id: ""
	I1218 00:45:05.348534 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.348541 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:05.348546 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:05.348604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:05.373275 1201669 cri.go:89] found id: ""
	I1218 00:45:05.373290 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.373297 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:05.373302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:05.373362 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:05.397833 1201669 cri.go:89] found id: ""
	I1218 00:45:05.397846 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.397853 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:05.397859 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:05.397921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:05.422938 1201669 cri.go:89] found id: ""
	I1218 00:45:05.422952 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.422959 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:05.422964 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:05.423026 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:05.451027 1201669 cri.go:89] found id: ""
	I1218 00:45:05.451041 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.451048 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:05.451053 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:05.451115 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:05.477082 1201669 cri.go:89] found id: ""
	I1218 00:45:05.477096 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.477102 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:05.477110 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:05.477120 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:05.543065 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:05.543083 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:05.558032 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:05.558047 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:05.623058 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:45:05.623071 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:05.623081 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:05.694967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:05.694987 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.224381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:08.234565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:08.234639 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:08.262640 1201669 cri.go:89] found id: ""
	I1218 00:45:08.262654 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.262661 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:08.262667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:08.262724 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:08.288384 1201669 cri.go:89] found id: ""
	I1218 00:45:08.288397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.288404 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:08.288409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:08.288468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:08.314880 1201669 cri.go:89] found id: ""
	I1218 00:45:08.314893 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.314900 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:08.314911 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:08.314971 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:08.340105 1201669 cri.go:89] found id: ""
	I1218 00:45:08.340119 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.340125 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:08.340131 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:08.340202 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:08.370009 1201669 cri.go:89] found id: ""
	I1218 00:45:08.370023 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.370030 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:08.370035 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:08.370094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:08.394925 1201669 cri.go:89] found id: ""
	I1218 00:45:08.394939 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.394946 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:08.394951 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:08.395013 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:08.419448 1201669 cri.go:89] found id: ""
	I1218 00:45:08.419462 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.419469 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:08.419477 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:08.419487 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:08.493271 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:08.493290 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.521236 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:08.521251 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:08.591011 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:08.591030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:08.605700 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:08.605716 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:08.674615 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.175336 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:11.186731 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:11.186790 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:11.217495 1201669 cri.go:89] found id: ""
	I1218 00:45:11.217510 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.217517 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:11.217522 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:11.217579 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:11.242494 1201669 cri.go:89] found id: ""
	I1218 00:45:11.242506 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.242514 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:11.242519 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:11.242588 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:11.269562 1201669 cri.go:89] found id: ""
	I1218 00:45:11.269576 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.269583 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:11.269588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:11.269646 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:11.296483 1201669 cri.go:89] found id: ""
	I1218 00:45:11.296497 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.296503 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:11.296517 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:11.296573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:11.324023 1201669 cri.go:89] found id: ""
	I1218 00:45:11.324037 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.324044 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:11.324049 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:11.324107 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:11.350813 1201669 cri.go:89] found id: ""
	I1218 00:45:11.350826 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.350833 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:11.350838 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:11.350915 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:11.375508 1201669 cri.go:89] found id: ""
	I1218 00:45:11.375522 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.375529 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:11.375538 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:11.375548 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:11.443170 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:11.443196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:11.458193 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:11.458209 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:11.526119 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.526129 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:11.526139 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:11.598390 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:11.598409 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:14.127470 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:14.140176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:14.140248 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:14.175467 1201669 cri.go:89] found id: ""
	I1218 00:45:14.175481 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.175488 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:14.175493 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:14.175550 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:14.205623 1201669 cri.go:89] found id: ""
	I1218 00:45:14.205637 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.205649 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:14.205655 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:14.205727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:14.232765 1201669 cri.go:89] found id: ""
	I1218 00:45:14.232779 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.232786 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:14.232790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:14.232848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:14.259382 1201669 cri.go:89] found id: ""
	I1218 00:45:14.259396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.259403 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:14.259408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:14.259465 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:14.284118 1201669 cri.go:89] found id: ""
	I1218 00:45:14.284132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.284139 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:14.284144 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:14.284205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:14.308510 1201669 cri.go:89] found id: ""
	I1218 00:45:14.308530 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.308536 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:14.308552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:14.308619 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:14.336798 1201669 cri.go:89] found id: ""
	I1218 00:45:14.336811 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.336819 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:14.336826 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:14.336837 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:14.402054 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:14.402074 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:14.416289 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:14.416306 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:14.480242 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:14.480255 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:14.480265 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:14.549733 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:14.549753 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:17.078515 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:17.088248 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:17.088306 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:17.112977 1201669 cri.go:89] found id: ""
	I1218 00:45:17.112990 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.112998 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:17.113004 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:17.113062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:17.137141 1201669 cri.go:89] found id: ""
	I1218 00:45:17.137154 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.137161 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:17.137167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:17.137223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:17.166013 1201669 cri.go:89] found id: ""
	I1218 00:45:17.166026 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.166033 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:17.166038 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:17.166098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:17.194884 1201669 cri.go:89] found id: ""
	I1218 00:45:17.194906 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.194920 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:17.194925 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:17.194990 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:17.220329 1201669 cri.go:89] found id: ""
	I1218 00:45:17.220342 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.220349 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:17.220354 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:17.220415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:17.248333 1201669 cri.go:89] found id: ""
	I1218 00:45:17.248347 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.248353 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:17.248359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:17.248415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:17.273057 1201669 cri.go:89] found id: ""
	I1218 00:45:17.273074 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.273084 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:17.273093 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:17.273104 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:17.339448 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:17.339467 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:17.354635 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:17.354652 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:17.422682 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:17.422703 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:17.422714 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:17.490930 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:17.490951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.021992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:20.032625 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:20.032687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:20.060698 1201669 cri.go:89] found id: ""
	I1218 00:45:20.060712 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.060719 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:20.060724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:20.060785 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:20.086680 1201669 cri.go:89] found id: ""
	I1218 00:45:20.086694 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.086701 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:20.086706 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:20.086766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:20.112553 1201669 cri.go:89] found id: ""
	I1218 00:45:20.112567 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.112574 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:20.112579 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:20.112642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:20.137056 1201669 cri.go:89] found id: ""
	I1218 00:45:20.137070 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.137077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:20.137082 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:20.137148 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:20.175745 1201669 cri.go:89] found id: ""
	I1218 00:45:20.175758 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.175775 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:20.175780 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:20.175848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:20.205557 1201669 cri.go:89] found id: ""
	I1218 00:45:20.205570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.205578 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:20.205583 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:20.205645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:20.234726 1201669 cri.go:89] found id: ""
	I1218 00:45:20.234739 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.234746 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:20.234754 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:20.234773 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:20.303025 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:20.303044 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.331096 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:20.331118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:20.397831 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:20.397856 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:20.412745 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:20.412761 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:20.480267 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
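The block above fails because every kubectl request dies with `dial tcp [::1]:8441: connect: connection refused`. When triaging logs like this, the first question is which endpoint is refusing; a minimal sketch of extracting it from one of the stderr lines (the sample line below is abbreviated from the log, and the `sed` pattern is an illustrative one-liner, not minikube code):

```shell
# One of the repeated stderr fragments from the failure above, abbreviated.
line='dial tcp [::1]:8441: connect: connection refused'

# Pull out the bracketed host:port that refused the connection.
endpoint=$(printf '%s\n' "$line" | sed -n 's/.*dial tcp \(\[[^]]*\]:[0-9]*\).*/\1/p')
echo "$endpoint"   # the IPv6 loopback apiserver endpoint, [::1]:8441
```

Here the port 8441 (rather than the default 8443) is the apiserver port this functional test configures, so the refusal means nothing is listening on the node's loopback at that port.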
	I1218 00:45:22.980543 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:22.990690 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:22.990747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:23.016762 1201669 cri.go:89] found id: ""
	I1218 00:45:23.016795 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.016802 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:23.016807 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:23.016868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:23.042294 1201669 cri.go:89] found id: ""
	I1218 00:45:23.042308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.042315 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:23.042320 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:23.042379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:23.071377 1201669 cri.go:89] found id: ""
	I1218 00:45:23.071392 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.071399 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:23.071405 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:23.071463 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:23.096911 1201669 cri.go:89] found id: ""
	I1218 00:45:23.096925 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.096932 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:23.096938 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:23.097002 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:23.123350 1201669 cri.go:89] found id: ""
	I1218 00:45:23.123363 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.123370 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:23.123375 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:23.123455 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:23.156384 1201669 cri.go:89] found id: ""
	I1218 00:45:23.156397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.156404 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:23.156409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:23.156470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:23.198764 1201669 cri.go:89] found id: ""
	I1218 00:45:23.198777 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.198784 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:23.198792 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:23.198802 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:23.276991 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:23.277016 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:23.305929 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:23.305946 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:23.374243 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:23.374263 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:23.389391 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:23.389408 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:23.455741 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
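Each `found id: ""` entry in the loop above means `crictl ps -a --quiet --name=<component>` printed nothing at all: no container for that control-plane component exists in any state. A sketch of that emptiness check, with crictl stubbed out by `printf` since no container runtime is assumed here:

```shell
# Stand-in for: sudo crictl ps -a --quiet --name=kube-apiserver
# (--quiet prints only container IDs, one per line; empty output = no match).
ids=$(printf '')

if [ -z "$ids" ]; then
  echo 'No container was found matching "kube-apiserver"'
fi
```

This is why the log falls through to gathering kubelet, dmesg, and CRI-O journals: with zero control-plane containers, there are no container logs to collect.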
	I1218 00:45:25.956010 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:25.966329 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:25.966402 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:25.992362 1201669 cri.go:89] found id: ""
	I1218 00:45:25.992376 1201669 logs.go:282] 0 containers: []
	W1218 00:45:25.992383 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:25.992388 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:25.992446 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:26.020474 1201669 cri.go:89] found id: ""
	I1218 00:45:26.020487 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.020495 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:26.020500 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:26.020562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:26.053060 1201669 cri.go:89] found id: ""
	I1218 00:45:26.053083 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.053090 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:26.053096 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:26.053168 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:26.080555 1201669 cri.go:89] found id: ""
	I1218 00:45:26.080570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.080577 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:26.080582 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:26.080642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:26.106383 1201669 cri.go:89] found id: ""
	I1218 00:45:26.106396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.106405 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:26.106413 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:26.106472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:26.133033 1201669 cri.go:89] found id: ""
	I1218 00:45:26.133046 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.133053 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:26.133059 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:26.133114 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:26.166644 1201669 cri.go:89] found id: ""
	I1218 00:45:26.166662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.166669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:26.166683 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:26.166693 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:26.249137 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:26.249156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:26.266352 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:26.266372 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:26.337214 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
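The timestamps show the whole sequence repeating roughly every three seconds (00:45:20, 00:45:23, 00:45:26, ...): a poll-until-healthy loop that keeps probing for a kube-apiserver process and re-collecting diagnostics on each miss. A hedged sketch of that retry pattern, with the health probe stubbed by a counter (`probe`, `attempts_needed`, and the interval are illustrative stand-ins, not minikube's actual implementation):

```shell
# Stub probe: "succeeds" on the 3rd call, standing in for
# `sudo pgrep -xnf kube-apiserver.*minikube.*` finding a process.
attempts_needed=3
count=0
probe() {
  count=$((count + 1))
  [ "$count" -ge "$attempts_needed" ]
}

tries=0
until probe; do
  tries=$((tries + 1))
  [ "$tries" -ge 10 ] && break     # give up after a bounded number of attempts
  sleep 0.1                         # the log suggests ~3s between probes; shortened here
done
echo "succeeded after $count probes"
```

In this failing run the probe never succeeds, so the loop keeps cycling until the test's overall timeout expires, which is what inflates the SoftStart durations in the summary table.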
	I1218 00:45:26.337225 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:26.337235 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:26.407577 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:26.407597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:28.937809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:28.947798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:28.947860 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:28.972642 1201669 cri.go:89] found id: ""
	I1218 00:45:28.972655 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.972662 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:28.972667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:28.972727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:28.997812 1201669 cri.go:89] found id: ""
	I1218 00:45:28.997827 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.997834 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:28.997839 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:28.997897 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:29.025172 1201669 cri.go:89] found id: ""
	I1218 00:45:29.025188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.025195 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:29.025200 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:29.025261 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:29.050129 1201669 cri.go:89] found id: ""
	I1218 00:45:29.050143 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.050151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:29.050156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:29.050216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:29.074056 1201669 cri.go:89] found id: ""
	I1218 00:45:29.074069 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.074076 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:29.074081 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:29.074138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:29.102343 1201669 cri.go:89] found id: ""
	I1218 00:45:29.102356 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.102363 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:29.102369 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:29.102426 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:29.126969 1201669 cri.go:89] found id: ""
	I1218 00:45:29.126982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.126989 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:29.126996 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:29.127007 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:29.201687 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:29.201704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:29.216680 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:29.216696 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:29.290639 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:29.290648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:29.290670 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:29.363990 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:29.364013 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:31.902567 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:31.912532 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:31.912590 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:31.938305 1201669 cri.go:89] found id: ""
	I1218 00:45:31.938319 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.938326 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:31.938331 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:31.938387 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:31.962545 1201669 cri.go:89] found id: ""
	I1218 00:45:31.962558 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.962565 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:31.962570 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:31.962632 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:31.987508 1201669 cri.go:89] found id: ""
	I1218 00:45:31.987521 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.987529 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:31.987534 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:31.987592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:32.014382 1201669 cri.go:89] found id: ""
	I1218 00:45:32.014395 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.014402 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:32.014408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:32.014474 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:32.041186 1201669 cri.go:89] found id: ""
	I1218 00:45:32.041200 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.041207 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:32.041212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:32.041271 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:32.067285 1201669 cri.go:89] found id: ""
	I1218 00:45:32.067308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.067316 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:32.067322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:32.067382 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:32.094234 1201669 cri.go:89] found id: ""
	I1218 00:45:32.094247 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.094254 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:32.094262 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:32.094272 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:32.164781 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:32.164800 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:32.197838 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:32.197854 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:32.268628 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:32.268648 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:32.282984 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:32.283001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:32.352888 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:34.853182 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:34.863312 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:34.863372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:34.887731 1201669 cri.go:89] found id: ""
	I1218 00:45:34.887745 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.887751 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:34.887756 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:34.887813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:34.913433 1201669 cri.go:89] found id: ""
	I1218 00:45:34.913446 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.913453 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:34.913458 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:34.913525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:34.938029 1201669 cri.go:89] found id: ""
	I1218 00:45:34.938043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.938050 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:34.938056 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:34.938125 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:34.963314 1201669 cri.go:89] found id: ""
	I1218 00:45:34.963327 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.963334 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:34.963339 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:34.963395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:34.991684 1201669 cri.go:89] found id: ""
	I1218 00:45:34.991699 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.991706 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:34.991711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:34.991775 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:35.019323 1201669 cri.go:89] found id: ""
	I1218 00:45:35.019338 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.019344 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:35.019350 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:35.019412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:35.044946 1201669 cri.go:89] found id: ""
	I1218 00:45:35.044960 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.044966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:35.044975 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:35.044986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:35.059688 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:35.059704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:35.127679 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:35.127700 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:35.127711 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:35.200793 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:35.200812 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:35.229277 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:35.229293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:37.797709 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:37.807597 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:37.807657 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:37.834367 1201669 cri.go:89] found id: ""
	I1218 00:45:37.834381 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.834399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:37.834404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:37.834466 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:37.862884 1201669 cri.go:89] found id: ""
	I1218 00:45:37.862898 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.862905 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:37.862910 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:37.862967 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:37.887715 1201669 cri.go:89] found id: ""
	I1218 00:45:37.887729 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.887736 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:37.887741 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:37.887800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:37.912412 1201669 cri.go:89] found id: ""
	I1218 00:45:37.912425 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.912432 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:37.912437 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:37.912500 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:37.936195 1201669 cri.go:89] found id: ""
	I1218 00:45:37.936209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.936216 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:37.936250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:37.936308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:37.964630 1201669 cri.go:89] found id: ""
	I1218 00:45:37.964645 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.964658 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:37.964663 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:37.964718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:37.997426 1201669 cri.go:89] found id: ""
	I1218 00:45:37.997439 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.997446 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:37.997454 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:37.997468 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:38.035686 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:38.035710 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:38.103558 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:38.103578 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:38.118520 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:38.118538 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:38.213391 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:38.213399 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:38.213410 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:40.782711 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:40.792421 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:40.792487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:40.816808 1201669 cri.go:89] found id: ""
	I1218 00:45:40.816821 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.816828 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:40.816833 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:40.816889 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:40.842296 1201669 cri.go:89] found id: ""
	I1218 00:45:40.842309 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.842316 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:40.842321 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:40.842381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:40.870550 1201669 cri.go:89] found id: ""
	I1218 00:45:40.870563 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.870570 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:40.870575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:40.870631 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:40.895987 1201669 cri.go:89] found id: ""
	I1218 00:45:40.896000 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.896007 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:40.896012 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:40.896071 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:40.922196 1201669 cri.go:89] found id: ""
	I1218 00:45:40.922209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.922217 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:40.922228 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:40.922287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:40.951012 1201669 cri.go:89] found id: ""
	I1218 00:45:40.951025 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.951032 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:40.951037 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:40.951094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:40.975029 1201669 cri.go:89] found id: ""
	I1218 00:45:40.975043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.975049 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:40.975057 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:40.975068 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:41.038362 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:41.038371 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:41.038383 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:41.106531 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:41.106550 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:41.133380 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:41.133396 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:41.202955 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:41.202974 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.720946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:43.730523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:43.730580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:43.758480 1201669 cri.go:89] found id: ""
	I1218 00:45:43.758494 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.758501 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:43.758506 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:43.758562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:43.782891 1201669 cri.go:89] found id: ""
	I1218 00:45:43.782904 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.782910 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:43.782915 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:43.782969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:43.807881 1201669 cri.go:89] found id: ""
	I1218 00:45:43.807895 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.807901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:43.807906 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:43.807962 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:43.831922 1201669 cri.go:89] found id: ""
	I1218 00:45:43.831934 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.831941 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:43.831946 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:43.832005 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:43.857303 1201669 cri.go:89] found id: ""
	I1218 00:45:43.857316 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.857323 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:43.857328 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:43.857385 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:43.882932 1201669 cri.go:89] found id: ""
	I1218 00:45:43.882945 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.882962 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:43.882967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:43.883034 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:43.910989 1201669 cri.go:89] found id: ""
	I1218 00:45:43.911003 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.911010 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:43.911017 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:43.911027 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:43.976855 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:43.976875 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.992065 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:43.992080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:44.066663 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:44.066673 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:44.066683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:44.136150 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:44.136169 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.674809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:46.685189 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:46.685253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:46.710336 1201669 cri.go:89] found id: ""
	I1218 00:45:46.710350 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.710357 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:46.710362 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:46.710423 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:46.735331 1201669 cri.go:89] found id: ""
	I1218 00:45:46.735344 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.735351 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:46.735356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:46.735412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:46.760113 1201669 cri.go:89] found id: ""
	I1218 00:45:46.760126 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.760133 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:46.760138 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:46.760192 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:46.785212 1201669 cri.go:89] found id: ""
	I1218 00:45:46.785225 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.785231 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:46.785237 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:46.785292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:46.810594 1201669 cri.go:89] found id: ""
	I1218 00:45:46.810607 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.810614 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:46.810619 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:46.810678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:46.835217 1201669 cri.go:89] found id: ""
	I1218 00:45:46.835231 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.835237 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:46.835242 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:46.835300 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:46.859864 1201669 cri.go:89] found id: ""
	I1218 00:45:46.859877 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.859891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:46.859899 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:46.859910 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.887041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:46.887057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:46.953500 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:46.953519 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:46.968086 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:46.968102 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:47.030071 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:47.030081 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:47.030091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.602443 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:49.612708 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:49.612770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:49.638886 1201669 cri.go:89] found id: ""
	I1218 00:45:49.638900 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.638907 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:49.638912 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:49.638969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:49.666118 1201669 cri.go:89] found id: ""
	I1218 00:45:49.666132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.666139 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:49.666145 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:49.666205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:49.695529 1201669 cri.go:89] found id: ""
	I1218 00:45:49.695542 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.695549 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:49.695554 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:49.695609 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:49.718430 1201669 cri.go:89] found id: ""
	I1218 00:45:49.718444 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.718451 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:49.718457 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:49.718514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:49.742944 1201669 cri.go:89] found id: ""
	I1218 00:45:49.742957 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.742964 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:49.742969 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:49.743028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:49.767863 1201669 cri.go:89] found id: ""
	I1218 00:45:49.767876 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.767888 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:49.767894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:49.767949 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:49.792207 1201669 cri.go:89] found id: ""
	I1218 00:45:49.792254 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.792261 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:49.792269 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:49.792279 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:49.806632 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:49.806655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:49.869094 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:49.869105 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:49.869130 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.936480 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:49.936498 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:49.965414 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:49.965430 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.533961 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:52.543970 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:52.544028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:52.569650 1201669 cri.go:89] found id: ""
	I1218 00:45:52.569663 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.569671 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:52.569676 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:52.569735 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:52.593935 1201669 cri.go:89] found id: ""
	I1218 00:45:52.593949 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.593955 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:52.593961 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:52.594019 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:52.618968 1201669 cri.go:89] found id: ""
	I1218 00:45:52.618982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.618989 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:52.618994 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:52.619051 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:52.647696 1201669 cri.go:89] found id: ""
	I1218 00:45:52.647710 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.647717 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:52.647728 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:52.647787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:52.675609 1201669 cri.go:89] found id: ""
	I1218 00:45:52.675622 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.675629 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:52.675634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:52.675690 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:52.701982 1201669 cri.go:89] found id: ""
	I1218 00:45:52.701995 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.702001 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:52.702007 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:52.702064 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:52.727053 1201669 cri.go:89] found id: ""
	I1218 00:45:52.727066 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.727073 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:52.727081 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:52.727091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.793606 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:52.793626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:52.807921 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:52.807938 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:52.871908 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:52.871918 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:52.871942 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:52.939995 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:52.940015 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.467573 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:55.477751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:55.477808 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:55.503215 1201669 cri.go:89] found id: ""
	I1218 00:45:55.503229 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.503235 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:55.503241 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:55.503299 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:55.528321 1201669 cri.go:89] found id: ""
	I1218 00:45:55.528334 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.528341 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:55.528346 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:55.528406 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:55.555566 1201669 cri.go:89] found id: ""
	I1218 00:45:55.555580 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.555586 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:55.555591 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:55.555659 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:55.580858 1201669 cri.go:89] found id: ""
	I1218 00:45:55.580870 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.580877 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:55.580882 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:55.580941 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:55.609703 1201669 cri.go:89] found id: ""
	I1218 00:45:55.609717 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.609724 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:55.609729 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:55.609792 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:55.635271 1201669 cri.go:89] found id: ""
	I1218 00:45:55.635285 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.635301 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:55.635307 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:55.635379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:55.664174 1201669 cri.go:89] found id: ""
	I1218 00:45:55.664188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.664203 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:55.664211 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:55.664247 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:55.678574 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:55.678597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:55.741880 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:55.741890 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:55.741900 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:55.814783 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:55.814804 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.845128 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:55.845151 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.416331 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:58.426299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:58.426355 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:58.457684 1201669 cri.go:89] found id: ""
	I1218 00:45:58.457698 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.457705 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:58.457710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:58.457769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:58.482307 1201669 cri.go:89] found id: ""
	I1218 00:45:58.482320 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.482327 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:58.482332 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:58.482389 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:58.507442 1201669 cri.go:89] found id: ""
	I1218 00:45:58.507454 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.507461 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:58.507466 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:58.507523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:58.536949 1201669 cri.go:89] found id: ""
	I1218 00:45:58.536963 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.536969 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:58.536974 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:58.537030 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:58.565233 1201669 cri.go:89] found id: ""
	I1218 00:45:58.565246 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.565253 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:58.565257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:58.565313 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:58.589568 1201669 cri.go:89] found id: ""
	I1218 00:45:58.589582 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.589589 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:58.589594 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:58.589655 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:58.613117 1201669 cri.go:89] found id: ""
	I1218 00:45:58.613130 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.613137 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:58.613145 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:58.613156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:58.681549 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:58.681572 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:58.709658 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:58.709678 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.778632 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:58.778651 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:58.793209 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:58.793225 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:58.857093 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:01.358084 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:01.368502 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:01.368561 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:01.399458 1201669 cri.go:89] found id: ""
	I1218 00:46:01.399490 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.399498 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:01.399504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:01.399589 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:01.429331 1201669 cri.go:89] found id: ""
	I1218 00:46:01.429346 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.429353 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:01.429359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:01.429418 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:01.463764 1201669 cri.go:89] found id: ""
	I1218 00:46:01.463777 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.463784 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:01.463792 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:01.463852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:01.490438 1201669 cri.go:89] found id: ""
	I1218 00:46:01.490451 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.490458 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:01.490464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:01.490523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:01.515150 1201669 cri.go:89] found id: ""
	I1218 00:46:01.515163 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.515170 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:01.515176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:01.515238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:01.541480 1201669 cri.go:89] found id: ""
	I1218 00:46:01.541494 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.541501 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:01.541507 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:01.541567 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:01.566788 1201669 cri.go:89] found id: ""
	I1218 00:46:01.566802 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.566809 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:01.566817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:01.566827 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:01.630909 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:01.622550   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.623304   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.624851   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.625362   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.626989   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:01.630919 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:01.630929 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:01.699339 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:01.699360 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:01.730198 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:01.730213 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:01.798536 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:01.798555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:04.314812 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:04.325258 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:04.325319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:04.350282 1201669 cri.go:89] found id: ""
	I1218 00:46:04.350302 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.350309 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:04.350314 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:04.350374 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:04.375290 1201669 cri.go:89] found id: ""
	I1218 00:46:04.375305 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.375311 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:04.375316 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:04.375381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:04.410898 1201669 cri.go:89] found id: ""
	I1218 00:46:04.410911 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.410918 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:04.410923 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:04.410980 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:04.448128 1201669 cri.go:89] found id: ""
	I1218 00:46:04.448141 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.448151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:04.448156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:04.448214 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:04.478635 1201669 cri.go:89] found id: ""
	I1218 00:46:04.478648 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.478655 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:04.478660 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:04.478718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:04.504261 1201669 cri.go:89] found id: ""
	I1218 00:46:04.504275 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.504282 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:04.504288 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:04.504345 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:04.529823 1201669 cri.go:89] found id: ""
	I1218 00:46:04.529836 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.529843 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:04.529851 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:04.529862 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:04.595056 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:04.587112   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.587762   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589353   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589778   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.591223   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:04.595066 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:04.595076 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:04.665580 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:04.665600 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:04.695540 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:04.695555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:04.766700 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:04.766721 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.281438 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:07.291184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:07.291241 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:07.318270 1201669 cri.go:89] found id: ""
	I1218 00:46:07.318283 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.318290 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:07.318295 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:07.318353 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:07.342684 1201669 cri.go:89] found id: ""
	I1218 00:46:07.342697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.342704 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:07.342718 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:07.342777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:07.367159 1201669 cri.go:89] found id: ""
	I1218 00:46:07.367173 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.367180 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:07.367186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:07.367252 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:07.399917 1201669 cri.go:89] found id: ""
	I1218 00:46:07.399942 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.399949 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:07.399954 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:07.400025 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:07.428891 1201669 cri.go:89] found id: ""
	I1218 00:46:07.428904 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.428911 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:07.428918 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:07.428988 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:07.461232 1201669 cri.go:89] found id: ""
	I1218 00:46:07.461244 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.461251 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:07.461257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:07.461319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:07.487577 1201669 cri.go:89] found id: ""
	I1218 00:46:07.487590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.487607 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:07.487616 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:07.487626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:07.554637 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:07.554656 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.570064 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:07.570080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:07.635097 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:07.627057   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.627642   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629308   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629740   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.631233   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:07.635107 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:07.635118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:07.706762 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:07.706782 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.235305 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:10.245498 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:10.245568 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:10.275954 1201669 cri.go:89] found id: ""
	I1218 00:46:10.275965 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.275972 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:10.275985 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:10.276042 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:10.301377 1201669 cri.go:89] found id: ""
	I1218 00:46:10.301391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.301397 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:10.301402 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:10.301468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:10.327075 1201669 cri.go:89] found id: ""
	I1218 00:46:10.327089 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.327096 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:10.327101 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:10.327163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:10.355039 1201669 cri.go:89] found id: ""
	I1218 00:46:10.355052 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.355059 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:10.355064 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:10.355126 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:10.380800 1201669 cri.go:89] found id: ""
	I1218 00:46:10.380814 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.380821 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:10.380826 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:10.380883 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:10.420766 1201669 cri.go:89] found id: ""
	I1218 00:46:10.420781 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.420788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:10.420794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:10.420852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:10.450993 1201669 cri.go:89] found id: ""
	I1218 00:46:10.451006 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.451013 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:10.451021 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:10.451031 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:10.469649 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:10.469664 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:10.534853 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:10.534862 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:10.534873 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:10.603061 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:10.603080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.634944 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:10.634961 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.201986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:13.212552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:13.212611 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:13.236454 1201669 cri.go:89] found id: ""
	I1218 00:46:13.236468 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.236475 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:13.236481 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:13.236542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:13.261394 1201669 cri.go:89] found id: ""
	I1218 00:46:13.261408 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.261415 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:13.261420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:13.261479 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:13.286366 1201669 cri.go:89] found id: ""
	I1218 00:46:13.286380 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.286393 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:13.286398 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:13.286457 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:13.311045 1201669 cri.go:89] found id: ""
	I1218 00:46:13.311058 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.311065 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:13.311070 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:13.311132 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:13.336414 1201669 cri.go:89] found id: ""
	I1218 00:46:13.336427 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.336434 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:13.336439 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:13.336503 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:13.366089 1201669 cri.go:89] found id: ""
	I1218 00:46:13.366102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.366109 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:13.366114 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:13.366170 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:13.398167 1201669 cri.go:89] found id: ""
	I1218 00:46:13.398180 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.398187 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:13.398195 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:13.398205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.472148 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:13.472173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:13.487248 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:13.487267 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:13.552950 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:13.552960 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:13.552973 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:13.622039 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:13.622058 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:16.149384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:16.159725 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:16.159786 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:16.186967 1201669 cri.go:89] found id: ""
	I1218 00:46:16.186981 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.186988 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:16.186993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:16.187052 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:16.213347 1201669 cri.go:89] found id: ""
	I1218 00:46:16.213361 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.213368 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:16.213374 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:16.213431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:16.239666 1201669 cri.go:89] found id: ""
	I1218 00:46:16.239679 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.239686 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:16.239692 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:16.239747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:16.264667 1201669 cri.go:89] found id: ""
	I1218 00:46:16.264680 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.264686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:16.264691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:16.264747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:16.290913 1201669 cri.go:89] found id: ""
	I1218 00:46:16.290925 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.290932 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:16.290937 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:16.290995 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:16.318436 1201669 cri.go:89] found id: ""
	I1218 00:46:16.318449 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.318458 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:16.318464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:16.318522 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:16.344303 1201669 cri.go:89] found id: ""
	I1218 00:46:16.344316 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.344323 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:16.344331 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:16.344342 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:16.411796 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:16.411814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:16.427899 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:16.427916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:16.499022 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:16.499032 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:16.499042 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:16.568931 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:16.568951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.102749 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:19.112504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:19.112560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:19.140375 1201669 cri.go:89] found id: ""
	I1218 00:46:19.140389 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.140396 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:19.140401 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:19.140462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:19.170806 1201669 cri.go:89] found id: ""
	I1218 00:46:19.170832 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.170840 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:19.170848 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:19.170930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:19.202879 1201669 cri.go:89] found id: ""
	I1218 00:46:19.202894 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.202901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:19.202907 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:19.202973 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:19.226832 1201669 cri.go:89] found id: ""
	I1218 00:46:19.226844 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.226851 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:19.226856 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:19.226913 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:19.251251 1201669 cri.go:89] found id: ""
	I1218 00:46:19.251264 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.251271 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:19.251277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:19.251334 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:19.275051 1201669 cri.go:89] found id: ""
	I1218 00:46:19.275064 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.275071 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:19.275080 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:19.275138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:19.303255 1201669 cri.go:89] found id: ""
	I1218 00:46:19.303268 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.303291 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:19.303299 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:19.303309 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.332819 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:19.332836 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:19.398262 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:19.398281 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:19.413015 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:19.413030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:19.483412 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:19.483423 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:19.483475 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:22.052118 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:22.062390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:22.062454 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:22.091903 1201669 cri.go:89] found id: ""
	I1218 00:46:22.091917 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.091924 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:22.091930 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:22.091987 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:22.116458 1201669 cri.go:89] found id: ""
	I1218 00:46:22.116471 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.116478 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:22.116483 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:22.116560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:22.142090 1201669 cri.go:89] found id: ""
	I1218 00:46:22.142102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.142109 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:22.142115 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:22.142180 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:22.166148 1201669 cri.go:89] found id: ""
	I1218 00:46:22.166162 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.166169 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:22.166175 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:22.166234 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:22.191864 1201669 cri.go:89] found id: ""
	I1218 00:46:22.191877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.191884 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:22.191890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:22.191953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:22.216176 1201669 cri.go:89] found id: ""
	I1218 00:46:22.216190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.216197 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:22.216202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:22.216283 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:22.240865 1201669 cri.go:89] found id: ""
	I1218 00:46:22.240878 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.240891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:22.240898 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:22.240908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:22.269665 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:22.269688 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:22.334885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:22.334903 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:22.349240 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:22.349256 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:22.424972 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:22.424982 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:22.425001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.004463 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:25.015873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:25.015934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:25.043533 1201669 cri.go:89] found id: ""
	I1218 00:46:25.043547 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.043558 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:25.043563 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:25.043630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:25.070860 1201669 cri.go:89] found id: ""
	I1218 00:46:25.070874 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.070881 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:25.070887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:25.070945 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:25.100326 1201669 cri.go:89] found id: ""
	I1218 00:46:25.100340 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.100349 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:25.100356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:25.100420 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:25.127292 1201669 cri.go:89] found id: ""
	I1218 00:46:25.127306 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.127313 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:25.127318 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:25.127376 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:25.152929 1201669 cri.go:89] found id: ""
	I1218 00:46:25.152943 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.152950 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:25.152955 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:25.153023 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:25.179602 1201669 cri.go:89] found id: ""
	I1218 00:46:25.179622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.179629 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:25.179634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:25.179691 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:25.204777 1201669 cri.go:89] found id: ""
	I1218 00:46:25.204790 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.204797 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:25.204804 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:25.204814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.274359 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:25.274379 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:25.305207 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:25.305224 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:25.375922 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:25.375941 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:25.392181 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:25.392196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:25.470714 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:27.970992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:27.980972 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:27.981029 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:28.007728 1201669 cri.go:89] found id: ""
	I1218 00:46:28.007744 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.007752 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:28.007758 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:28.007821 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:28.038973 1201669 cri.go:89] found id: ""
	I1218 00:46:28.038987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.038995 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:28.039000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:28.039063 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:28.066609 1201669 cri.go:89] found id: ""
	I1218 00:46:28.066622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.066629 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:28.066634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:28.066695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:28.092484 1201669 cri.go:89] found id: ""
	I1218 00:46:28.092498 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.092506 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:28.092512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:28.092583 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:28.119611 1201669 cri.go:89] found id: ""
	I1218 00:46:28.119625 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.119632 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:28.119638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:28.119698 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:28.145154 1201669 cri.go:89] found id: ""
	I1218 00:46:28.145167 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.145175 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:28.145180 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:28.145238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:28.170178 1201669 cri.go:89] found id: ""
	I1218 00:46:28.170191 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.170198 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:28.170206 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:28.170216 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:28.235805 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:28.235824 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:28.250608 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:28.250629 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:28.314678 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:28.314687 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:28.314698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:28.383399 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:28.383420 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:30.924810 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:30.935068 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:30.935128 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:30.960550 1201669 cri.go:89] found id: ""
	I1218 00:46:30.960563 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.960570 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:30.960575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:30.960636 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:30.985705 1201669 cri.go:89] found id: ""
	I1218 00:46:30.985718 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.985725 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:30.985730 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:30.985787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:31.011725 1201669 cri.go:89] found id: ""
	I1218 00:46:31.011739 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.011746 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:31.011751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:31.011813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:31.038735 1201669 cri.go:89] found id: ""
	I1218 00:46:31.038748 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.038755 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:31.038760 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:31.038822 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:31.062623 1201669 cri.go:89] found id: ""
	I1218 00:46:31.062637 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.062645 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:31.062651 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:31.062716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:31.089339 1201669 cri.go:89] found id: ""
	I1218 00:46:31.089353 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.089366 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:31.089372 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:31.089431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:31.119659 1201669 cri.go:89] found id: ""
	I1218 00:46:31.119672 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.119679 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:31.119687 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:31.119698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:31.185677 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:31.185697 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:31.200077 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:31.200092 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:31.263573 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:31.263582 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:31.263593 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:31.331836 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:31.331857 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:33.859870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:33.871250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:33.871309 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:33.899076 1201669 cri.go:89] found id: ""
	I1218 00:46:33.899090 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.899097 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:33.899103 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:33.899163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:33.927937 1201669 cri.go:89] found id: ""
	I1218 00:46:33.927955 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.927961 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:33.927967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:33.928024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:33.954257 1201669 cri.go:89] found id: ""
	I1218 00:46:33.954271 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.954278 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:33.954283 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:33.954339 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:33.978840 1201669 cri.go:89] found id: ""
	I1218 00:46:33.978853 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.978860 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:33.978865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:33.978921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:34.008172 1201669 cri.go:89] found id: ""
	I1218 00:46:34.008186 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.008193 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:34.008198 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:34.008296 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:34.038029 1201669 cri.go:89] found id: ""
	I1218 00:46:34.038043 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.038050 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:34.038057 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:34.038116 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:34.067280 1201669 cri.go:89] found id: ""
	I1218 00:46:34.067294 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.067302 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:34.067311 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:34.067321 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:34.099533 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:34.099549 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:34.165421 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:34.165442 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:34.179966 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:34.179981 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:34.243670 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:34.243681 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:34.243694 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:36.812424 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:36.822427 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:36.822486 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:36.847846 1201669 cri.go:89] found id: ""
	I1218 00:46:36.847859 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.847866 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:36.847872 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:36.847927 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:36.873323 1201669 cri.go:89] found id: ""
	I1218 00:46:36.873337 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.873344 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:36.873349 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:36.873408 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:36.898528 1201669 cri.go:89] found id: ""
	I1218 00:46:36.898541 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.898547 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:36.898553 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:36.898608 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:36.925176 1201669 cri.go:89] found id: ""
	I1218 00:46:36.925190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.925197 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:36.925202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:36.925260 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:36.954449 1201669 cri.go:89] found id: ""
	I1218 00:46:36.954463 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.954469 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:36.954474 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:36.954533 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:36.978226 1201669 cri.go:89] found id: ""
	I1218 00:46:36.978239 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.978246 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:36.978251 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:36.978308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:37.005731 1201669 cri.go:89] found id: ""
	I1218 00:46:37.005747 1201669 logs.go:282] 0 containers: []
	W1218 00:46:37.005755 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:37.005764 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:37.005776 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:37.026584 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:37.026606 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:37.089657 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:37.089672 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:37.089683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:37.161954 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:37.161980 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:37.189136 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:37.189155 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:39.765929 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:39.776452 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:39.776510 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:39.801519 1201669 cri.go:89] found id: ""
	I1218 00:46:39.801532 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.801539 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:39.801544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:39.801604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:39.829201 1201669 cri.go:89] found id: ""
	I1218 00:46:39.829215 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.829222 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:39.829226 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:39.829287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:39.854274 1201669 cri.go:89] found id: ""
	I1218 00:46:39.854287 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.854294 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:39.854299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:39.854357 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:39.879811 1201669 cri.go:89] found id: ""
	I1218 00:46:39.879824 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.879831 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:39.879836 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:39.879893 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:39.912296 1201669 cri.go:89] found id: ""
	I1218 00:46:39.912310 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.912317 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:39.912322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:39.912380 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:39.939288 1201669 cri.go:89] found id: ""
	I1218 00:46:39.939313 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.939321 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:39.939326 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:39.939393 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:39.967012 1201669 cri.go:89] found id: ""
	I1218 00:46:39.967027 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.967034 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:39.967041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:39.967051 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:40.033896 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:40.033919 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:40.052546 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:40.052564 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:40.123489 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:40.123524 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:40.123537 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:40.195140 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:40.195161 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:42.731664 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:42.741511 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:42.741573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:42.765856 1201669 cri.go:89] found id: ""
	I1218 00:46:42.765869 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.765876 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:42.765881 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:42.765947 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:42.790000 1201669 cri.go:89] found id: ""
	I1218 00:46:42.790013 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.790020 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:42.790025 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:42.790080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:42.814497 1201669 cri.go:89] found id: ""
	I1218 00:46:42.814511 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.814518 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:42.814523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:42.814580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:42.839923 1201669 cri.go:89] found id: ""
	I1218 00:46:42.839937 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.839943 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:42.839948 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:42.840009 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:42.866771 1201669 cri.go:89] found id: ""
	I1218 00:46:42.866784 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.866791 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:42.866798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:42.866856 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:42.894391 1201669 cri.go:89] found id: ""
	I1218 00:46:42.894404 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.894411 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:42.894416 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:42.894481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:42.919369 1201669 cri.go:89] found id: ""
	I1218 00:46:42.919391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.919399 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:42.919408 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:42.919419 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:42.934812 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:42.934829 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:42.998153 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:42.998162 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:42.998173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:43.067475 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:43.067494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:43.097319 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:43.097335 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:45.664349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:45.675110 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:45.675171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:45.707428 1201669 cri.go:89] found id: ""
	I1218 00:46:45.707442 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.707449 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:45.707454 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:45.707512 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:45.732673 1201669 cri.go:89] found id: ""
	I1218 00:46:45.732687 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.732694 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:45.732700 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:45.732759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:45.756652 1201669 cri.go:89] found id: ""
	I1218 00:46:45.756666 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.756673 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:45.756679 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:45.756741 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:45.781416 1201669 cri.go:89] found id: ""
	I1218 00:46:45.781430 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.781437 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:45.781442 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:45.781498 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:45.806268 1201669 cri.go:89] found id: ""
	I1218 00:46:45.806281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.806288 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:45.806294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:45.806363 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:45.831015 1201669 cri.go:89] found id: ""
	I1218 00:46:45.831028 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.831035 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:45.831040 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:45.831098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:45.855951 1201669 cri.go:89] found id: ""
	I1218 00:46:45.855964 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.855970 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:45.855978 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:45.855988 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:45.870419 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:45.870436 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:45.934620 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:45.934630 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:45.934641 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:46.007377 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:46.007400 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:46.038285 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:46.038302 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.604685 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:48.614701 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:48.614759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:48.640971 1201669 cri.go:89] found id: ""
	I1218 00:46:48.640984 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.640991 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:48.640997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:48.641055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:48.670241 1201669 cri.go:89] found id: ""
	I1218 00:46:48.670254 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.670261 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:48.670266 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:48.670324 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:48.714267 1201669 cri.go:89] found id: ""
	I1218 00:46:48.714281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.714288 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:48.714294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:48.714359 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:48.743058 1201669 cri.go:89] found id: ""
	I1218 00:46:48.743071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.743077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:48.743083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:48.743146 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:48.768865 1201669 cri.go:89] found id: ""
	I1218 00:46:48.768877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.768885 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:48.768890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:48.768950 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:48.794057 1201669 cri.go:89] found id: ""
	I1218 00:46:48.794071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.794078 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:48.794083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:48.794139 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:48.824069 1201669 cri.go:89] found id: ""
	I1218 00:46:48.824082 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.824090 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:48.824102 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:48.824112 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.893155 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:48.893176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:48.908605 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:48.908621 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:48.974531 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:48.974541 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:48.974551 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:49.047912 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:49.047931 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.578760 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:51.588638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:51.588697 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:51.620629 1201669 cri.go:89] found id: ""
	I1218 00:46:51.620643 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.620649 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:51.620661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:51.620737 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:51.653267 1201669 cri.go:89] found id: ""
	I1218 00:46:51.653281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.653297 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:51.653302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:51.653372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:51.680215 1201669 cri.go:89] found id: ""
	I1218 00:46:51.680250 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.680257 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:51.680263 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:51.680328 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:51.712435 1201669 cri.go:89] found id: ""
	I1218 00:46:51.712448 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.712455 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:51.712460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:51.712525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:51.740973 1201669 cri.go:89] found id: ""
	I1218 00:46:51.740987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.740994 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:51.741000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:51.741057 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:51.765683 1201669 cri.go:89] found id: ""
	I1218 00:46:51.765697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.765704 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:51.765710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:51.765767 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:51.793065 1201669 cri.go:89] found id: ""
	I1218 00:46:51.793080 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.793088 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:51.793095 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:51.793106 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:51.807847 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:51.807863 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:51.870944 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:51.870953 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:51.870964 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:51.939037 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:51.939057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.973517 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:51.973532 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:54.540109 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:54.550150 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:54.550216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:54.579006 1201669 cri.go:89] found id: ""
	I1218 00:46:54.579019 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.579026 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:54.579031 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:54.579088 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:54.609045 1201669 cri.go:89] found id: ""
	I1218 00:46:54.609059 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.609066 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:54.609071 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:54.609130 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:54.640693 1201669 cri.go:89] found id: ""
	I1218 00:46:54.640707 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.640714 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:54.640720 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:54.640777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:54.674577 1201669 cri.go:89] found id: ""
	I1218 00:46:54.674590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.674597 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:54.674603 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:54.674658 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:54.709862 1201669 cri.go:89] found id: ""
	I1218 00:46:54.709875 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.709882 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:54.709887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:54.709946 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:54.735151 1201669 cri.go:89] found id: ""
	I1218 00:46:54.735165 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.735171 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:54.735177 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:54.735237 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:54.762946 1201669 cri.go:89] found id: ""
	I1218 00:46:54.762960 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.762966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:54.762974 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:54.762984 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:54.778250 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:54.778266 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:54.841698 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:54.841707 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:54.841718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:54.909164 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:54.909183 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:54.946219 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:54.946236 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.515189 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:57.525323 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:57.525384 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:57.550694 1201669 cri.go:89] found id: ""
	I1218 00:46:57.550708 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.550716 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:57.550721 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:57.550782 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:57.578567 1201669 cri.go:89] found id: ""
	I1218 00:46:57.578582 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.578590 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:57.578595 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:57.578656 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:57.604092 1201669 cri.go:89] found id: ""
	I1218 00:46:57.604105 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.604112 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:57.604120 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:57.604178 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:57.628719 1201669 cri.go:89] found id: ""
	I1218 00:46:57.628733 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.628739 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:57.628744 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:57.628806 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:57.666872 1201669 cri.go:89] found id: ""
	I1218 00:46:57.666885 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.666892 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:57.666897 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:57.666954 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:57.703636 1201669 cri.go:89] found id: ""
	I1218 00:46:57.703649 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.703656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:57.703661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:57.703721 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:57.729878 1201669 cri.go:89] found id: ""
	I1218 00:46:57.729891 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.729898 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:57.729905 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:57.729916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.793892 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:57.793911 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:57.808664 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:57.808680 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:57.871552 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:57.871570 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:57.871582 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:57.939629 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:57.939649 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:00.470791 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:00.480890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:00.480955 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:00.510265 1201669 cri.go:89] found id: ""
	I1218 00:47:00.510278 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.510285 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:00.510290 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:00.510349 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:00.534908 1201669 cri.go:89] found id: ""
	I1218 00:47:00.534922 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.534929 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:00.534934 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:00.534992 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:00.559619 1201669 cri.go:89] found id: ""
	I1218 00:47:00.559632 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.559639 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:00.559644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:00.559705 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:00.587698 1201669 cri.go:89] found id: ""
	I1218 00:47:00.587711 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.587719 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:00.587724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:00.587781 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:00.611884 1201669 cri.go:89] found id: ""
	I1218 00:47:00.611897 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.611904 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:00.611909 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:00.611974 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:00.640874 1201669 cri.go:89] found id: ""
	I1218 00:47:00.640888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.640895 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:00.640900 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:00.640965 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:00.674185 1201669 cri.go:89] found id: ""
	I1218 00:47:00.674198 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.674205 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:00.674213 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:00.674223 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:00.750327 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:00.750347 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:00.765877 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:00.765899 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:00.831441 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:00.831450 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:00.831462 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:00.899398 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:00.899423 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:03.427398 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:03.437572 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:03.437634 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:03.466927 1201669 cri.go:89] found id: ""
	I1218 00:47:03.466940 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.466948 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:03.466952 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:03.467011 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:03.492647 1201669 cri.go:89] found id: ""
	I1218 00:47:03.492661 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.492668 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:03.492672 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:03.492729 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:03.522689 1201669 cri.go:89] found id: ""
	I1218 00:47:03.522702 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.522709 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:03.522714 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:03.522774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:03.547665 1201669 cri.go:89] found id: ""
	I1218 00:47:03.547679 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.547686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:03.547691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:03.547754 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:03.573125 1201669 cri.go:89] found id: ""
	I1218 00:47:03.573139 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.573146 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:03.573151 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:03.573209 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:03.598799 1201669 cri.go:89] found id: ""
	I1218 00:47:03.598812 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.598819 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:03.598825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:03.598882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:03.622999 1201669 cri.go:89] found id: ""
	I1218 00:47:03.623013 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.623019 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:03.623027 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:03.623037 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:03.697686 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:03.697703 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:03.715817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:03.715833 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:03.782593 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:03.782603 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:03.782616 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:03.850592 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:03.850611 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:06.381230 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:06.390993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:06.391053 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:06.414602 1201669 cri.go:89] found id: ""
	I1218 00:47:06.414616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.414622 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:06.414628 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:06.414684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:06.438729 1201669 cri.go:89] found id: ""
	I1218 00:47:06.438743 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.438750 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:06.438755 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:06.438820 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:06.463196 1201669 cri.go:89] found id: ""
	I1218 00:47:06.463208 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.463215 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:06.463220 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:06.463275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:06.488161 1201669 cri.go:89] found id: ""
	I1218 00:47:06.488174 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.488181 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:06.488186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:06.488275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:06.517546 1201669 cri.go:89] found id: ""
	I1218 00:47:06.517559 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.517566 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:06.517571 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:06.517630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:06.541811 1201669 cri.go:89] found id: ""
	I1218 00:47:06.541825 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.541831 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:06.541837 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:06.541894 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:06.565470 1201669 cri.go:89] found id: ""
	I1218 00:47:06.565483 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.565491 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:06.565501 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:06.565511 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:06.630810 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:06.630828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:06.650036 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:06.650061 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:06.735359 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:06.735369 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:06.735382 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:06.804427 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:06.804447 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.337315 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:09.347711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:09.347770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:09.373796 1201669 cri.go:89] found id: ""
	I1218 00:47:09.373809 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.373817 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:09.373823 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:09.373887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:09.398745 1201669 cri.go:89] found id: ""
	I1218 00:47:09.398759 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.398766 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:09.398783 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:09.398850 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:09.424602 1201669 cri.go:89] found id: ""
	I1218 00:47:09.424616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.424623 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:09.424630 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:09.424687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:09.453853 1201669 cri.go:89] found id: ""
	I1218 00:47:09.453866 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.453873 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:09.453879 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:09.453934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:09.482334 1201669 cri.go:89] found id: ""
	I1218 00:47:09.482348 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.482355 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:09.482360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:09.482415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:09.514905 1201669 cri.go:89] found id: ""
	I1218 00:47:09.514928 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.514935 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:09.514941 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:09.515006 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:09.538866 1201669 cri.go:89] found id: ""
	I1218 00:47:09.538888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.538895 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:09.538903 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:09.538913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:09.553496 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:09.553516 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:09.615452 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:09.615461 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:09.615472 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:09.683616 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:09.683638 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.715893 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:09.715908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.282722 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:12.292327 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:12.292388 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:12.317025 1201669 cri.go:89] found id: ""
	I1218 00:47:12.317039 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.317045 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:12.317050 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:12.317106 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:12.341477 1201669 cri.go:89] found id: ""
	I1218 00:47:12.341490 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.341497 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:12.341501 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:12.341556 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:12.365784 1201669 cri.go:89] found id: ""
	I1218 00:47:12.365798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.365805 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:12.365810 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:12.365870 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:12.394874 1201669 cri.go:89] found id: ""
	I1218 00:47:12.394887 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.394894 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:12.394899 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:12.394958 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:12.419496 1201669 cri.go:89] found id: ""
	I1218 00:47:12.419509 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.419516 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:12.419521 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:12.419577 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:12.444379 1201669 cri.go:89] found id: ""
	I1218 00:47:12.444393 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.444399 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:12.444414 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:12.444470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:12.468918 1201669 cri.go:89] found id: ""
	I1218 00:47:12.468931 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.468939 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:12.468946 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:12.468960 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:12.537486 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:12.537505 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:12.568974 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:12.568990 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.635070 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:12.635089 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:12.652372 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:12.652388 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:12.728630 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:15.228895 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:15.239250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:15.239307 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:15.264983 1201669 cri.go:89] found id: ""
	I1218 00:47:15.264996 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.265003 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:15.265009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:15.265070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:15.293517 1201669 cri.go:89] found id: ""
	I1218 00:47:15.293531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.293537 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:15.293542 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:15.293599 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:15.319218 1201669 cri.go:89] found id: ""
	I1218 00:47:15.319231 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.319238 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:15.319243 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:15.319298 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:15.344396 1201669 cri.go:89] found id: ""
	I1218 00:47:15.344410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.344417 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:15.344422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:15.344481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:15.373243 1201669 cri.go:89] found id: ""
	I1218 00:47:15.373256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.373263 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:15.373268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:15.373329 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:15.397807 1201669 cri.go:89] found id: ""
	I1218 00:47:15.397820 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.397827 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:15.397832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:15.397887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:15.422535 1201669 cri.go:89] found id: ""
	I1218 00:47:15.422549 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.422557 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:15.422564 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:15.422574 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:15.490575 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:15.490595 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:15.521157 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:15.521176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:15.592728 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:15.592747 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:15.607949 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:15.607965 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:15.688565 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.190283 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:18.200009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:18.200073 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:18.224427 1201669 cri.go:89] found id: ""
	I1218 00:47:18.224440 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.224447 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:18.224453 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:18.224514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:18.248627 1201669 cri.go:89] found id: ""
	I1218 00:47:18.248641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.248648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:18.248653 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:18.248711 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:18.275672 1201669 cri.go:89] found id: ""
	I1218 00:47:18.275690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.275703 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:18.275709 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:18.275766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:18.302626 1201669 cri.go:89] found id: ""
	I1218 00:47:18.302640 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.302656 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:18.302661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:18.302716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:18.328772 1201669 cri.go:89] found id: ""
	I1218 00:47:18.328785 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.328792 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:18.328797 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:18.328852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:18.354242 1201669 cri.go:89] found id: ""
	I1218 00:47:18.354256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.354263 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:18.354268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:18.354332 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:18.378135 1201669 cri.go:89] found id: ""
	I1218 00:47:18.378148 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.378157 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:18.378165 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:18.378175 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:18.443885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:18.443904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:18.458116 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:18.458135 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:18.520486 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.520496 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:18.520507 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:18.586967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:18.586986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:21.118235 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:21.128015 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:21.128072 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:21.153715 1201669 cri.go:89] found id: ""
	I1218 00:47:21.153729 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.153736 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:21.153742 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:21.153803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:21.183062 1201669 cri.go:89] found id: ""
	I1218 00:47:21.183075 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.183082 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:21.183087 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:21.183144 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:21.210382 1201669 cri.go:89] found id: ""
	I1218 00:47:21.210396 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.210402 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:21.210407 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:21.210462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:21.235561 1201669 cri.go:89] found id: ""
	I1218 00:47:21.235575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.235582 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:21.235587 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:21.235684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:21.261486 1201669 cri.go:89] found id: ""
	I1218 00:47:21.261500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.261507 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:21.261512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:21.261571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:21.286687 1201669 cri.go:89] found id: ""
	I1218 00:47:21.286701 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.286708 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:21.286713 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:21.286770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:21.312639 1201669 cri.go:89] found id: ""
	I1218 00:47:21.312656 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.312663 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:21.312671 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:21.312682 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:21.377475 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:21.377494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:21.394148 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:21.394166 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:21.461525 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:21.461535 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:21.461546 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:21.529823 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:21.529841 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.060601 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:24.071009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:24.071080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:24.098379 1201669 cri.go:89] found id: ""
	I1218 00:47:24.098392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.098399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:24.098406 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:24.098520 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:24.125402 1201669 cri.go:89] found id: ""
	I1218 00:47:24.125416 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.125423 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:24.125428 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:24.125487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:24.151397 1201669 cri.go:89] found id: ""
	I1218 00:47:24.151410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.151417 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:24.151422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:24.151485 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:24.178459 1201669 cri.go:89] found id: ""
	I1218 00:47:24.178473 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.178480 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:24.178485 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:24.178542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:24.204162 1201669 cri.go:89] found id: ""
	I1218 00:47:24.204175 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.204182 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:24.204188 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:24.204282 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:24.232955 1201669 cri.go:89] found id: ""
	I1218 00:47:24.232969 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.232977 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:24.232982 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:24.233043 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:24.258828 1201669 cri.go:89] found id: ""
	I1218 00:47:24.258841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.258848 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:24.258856 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:24.258867 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.285593 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:24.285609 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:24.352328 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:24.352348 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:24.367078 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:24.367095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:24.430867 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:24.430877 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:24.430887 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.002647 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:27.013860 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:27.013930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:27.042334 1201669 cri.go:89] found id: ""
	I1218 00:47:27.042347 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.042354 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:27.042360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:27.042419 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:27.066697 1201669 cri.go:89] found id: ""
	I1218 00:47:27.066710 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.066717 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:27.066722 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:27.066777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:27.094998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.095011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.095018 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:27.095024 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:27.095081 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:27.122504 1201669 cri.go:89] found id: ""
	I1218 00:47:27.122518 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.122525 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:27.122530 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:27.122587 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:27.147998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.148011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.148018 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:27.148023 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:27.148093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:27.172132 1201669 cri.go:89] found id: ""
	I1218 00:47:27.172149 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.172156 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:27.172161 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:27.172253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:27.197418 1201669 cri.go:89] found id: ""
	I1218 00:47:27.197431 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.197438 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:27.197445 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:27.197455 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:27.263570 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:27.263588 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:27.278312 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:27.278327 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:27.342448 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:27.342458 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:27.342469 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.410881 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:27.410901 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
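The container checks in the cycle above run `crictl ps -a --quiet --name=<component>` for each control-plane component and treat an empty id list as "no container found". A minimal sketch of that interpretation step (the `report_component` helper and its wording are illustrative, not minikube's actual code; on a live node the ids would come from `sudo crictl ps -a --quiet --name=<component>`):

```shell
#!/usr/bin/env bash
# Interpret the output of `crictl ps -a --quiet --name=<component>`:
# -a includes exited containers, so an empty id list means no container
# in ANY state matched that name — exactly the situation in the log.
report_component() {
  local name="$1" ids="$2"
  if [ -z "$ids" ]; then
    echo "No container was found matching \"$name\""
  else
    echo "found id: $ids"
  fi
}

# On a live minikube node the second argument would be filled in with:
#   ids=$(sudo crictl ps -a --quiet --name=kube-apiserver)
report_component kube-apiserver ""
```

Because every component (kube-apiserver, etcd, coredns, scheduler, proxy, controller-manager, kindnet) returns an empty list here, the control plane never came up at all rather than crashing after start.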
	I1218 00:47:29.944358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:29.954644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:29.954701 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:29.978633 1201669 cri.go:89] found id: ""
	I1218 00:47:29.978647 1201669 logs.go:282] 0 containers: []
	W1218 00:47:29.978654 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:29.978659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:29.978717 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:30.009832 1201669 cri.go:89] found id: ""
	I1218 00:47:30.009850 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.009858 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:30.009864 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:30.009938 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:30.040840 1201669 cri.go:89] found id: ""
	I1218 00:47:30.040858 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.040867 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:30.040876 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:30.040952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:30.068318 1201669 cri.go:89] found id: ""
	I1218 00:47:30.068332 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.068339 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:30.068344 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:30.068407 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:30.094562 1201669 cri.go:89] found id: ""
	I1218 00:47:30.094577 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.094584 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:30.094589 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:30.094650 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:30.121388 1201669 cri.go:89] found id: ""
	I1218 00:47:30.121402 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.121409 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:30.121415 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:30.121472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:30.149519 1201669 cri.go:89] found id: ""
	I1218 00:47:30.149533 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.149540 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:30.149550 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:30.149565 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:30.177089 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:30.177107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:30.242748 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:30.242767 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:30.257468 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:30.257483 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:30.320728 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:30.320738 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:30.320749 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:32.889870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:32.900811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:32.900868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:32.927540 1201669 cri.go:89] found id: ""
	I1218 00:47:32.927553 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.927560 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:32.927565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:32.927622 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:32.955598 1201669 cri.go:89] found id: ""
	I1218 00:47:32.955611 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.955619 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:32.955623 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:32.955695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:32.979141 1201669 cri.go:89] found id: ""
	I1218 00:47:32.979155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.979162 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:32.979167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:32.979224 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:33.006203 1201669 cri.go:89] found id: ""
	I1218 00:47:33.006218 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.006225 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:33.006230 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:33.006294 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:33.034661 1201669 cri.go:89] found id: ""
	I1218 00:47:33.034675 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.034691 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:33.034697 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:33.034756 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:33.062772 1201669 cri.go:89] found id: ""
	I1218 00:47:33.062786 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.062793 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:33.062804 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:33.062869 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:33.086825 1201669 cri.go:89] found id: ""
	I1218 00:47:33.086839 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.086846 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:33.086871 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:33.086881 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:33.156565 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:33.156585 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:33.185756 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:33.185772 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:33.256648 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:33.256666 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:33.271243 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:33.271259 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:33.337446 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
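Every `kubectl describe nodes` attempt above fails with `dial tcp [::1]:8441: connect: connection refused`, i.e. nothing is listening on the apiserver port this cluster uses (8441, per the log). A dependency-free way to confirm that from the node is bash's `/dev/tcp` redirection; this probe is a sketch, not part of minikube's tooling:

```shell
#!/usr/bin/env bash
# Attempt a TCP connect using bash's /dev/tcp pseudo-device (bash-only,
# not POSIX sh). The subshell closes the descriptor on exit; a non-zero
# status corresponds to the "connection refused" kubectl reports.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open localhost 8441; then
  echo "apiserver port 8441 is accepting connections"
else
  echo "apiserver port 8441 refused/closed"
fi
```

With no kube-apiserver container found by crictl, this check would fail for the same underlying reason, confirming the errors are a symptom of the missing apiserver rather than a kubeconfig problem.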
	I1218 00:47:35.839102 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:35.850275 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:35.850343 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:35.875276 1201669 cri.go:89] found id: ""
	I1218 00:47:35.875289 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.875296 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:35.875301 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:35.875361 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:35.912387 1201669 cri.go:89] found id: ""
	I1218 00:47:35.912400 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.912407 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:35.912412 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:35.912471 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:35.942361 1201669 cri.go:89] found id: ""
	I1218 00:47:35.942379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.942394 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:35.942400 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:35.942499 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:35.972562 1201669 cri.go:89] found id: ""
	I1218 00:47:35.972575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.972584 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:35.972588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:35.972644 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:35.998846 1201669 cri.go:89] found id: ""
	I1218 00:47:35.998861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.998868 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:35.998874 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:35.998952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:36.030184 1201669 cri.go:89] found id: ""
	I1218 00:47:36.030197 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.030213 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:36.030219 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:36.030292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:36.055609 1201669 cri.go:89] found id: ""
	I1218 00:47:36.055624 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.055640 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:36.055648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:36.055658 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:36.128355 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:36.128374 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:36.159887 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:36.159904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:36.229693 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:36.229712 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:36.244397 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:36.244412 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:36.308352 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
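The timestamps show the same probe sequence repeating roughly every three seconds (00:47:27, :30, :33, :36, :39): minikube is polling `pgrep -xnf kube-apiserver.*minikube.*` until the apiserver appears or the SoftStart timeout expires. A generic wait-with-deadline loop in that style (function name, interval, and timeout are illustrative, not minikube's implementation):

```shell
#!/usr/bin/env bash
# Poll for a process matching a pattern until it appears or a deadline
# passes, mirroring the ~3s pgrep cadence visible in the log.
wait_for_process() {
  local pattern="$1" timeout="$2" interval="${3:-3}" waited=0
  while [ "$waited" -lt "$timeout" ]; do
    # pgrep -f matches against full command lines, like the log's probe.
    if pgrep -f "$pattern" >/dev/null 2>&1; then
      return 0
    fi
    sleep "$interval"
    waited=$((waited + interval))
  done
  return 1
}
```

Between polls minikube gathers the kubelet/CRI-O journals, dmesg, and container status seen above, so each failed iteration leaves a diagnostic snapshot; the loop only gives up (and the test fails) when the deadline is reached with the apiserver still absent.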
	I1218 00:47:38.808637 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:38.819085 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:38.819152 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:38.844745 1201669 cri.go:89] found id: ""
	I1218 00:47:38.844758 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.844766 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:38.844771 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:38.844827 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:38.869442 1201669 cri.go:89] found id: ""
	I1218 00:47:38.869456 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.869463 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:38.869469 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:38.869531 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:38.899129 1201669 cri.go:89] found id: ""
	I1218 00:47:38.899151 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.899158 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:38.899163 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:38.899232 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:38.929158 1201669 cri.go:89] found id: ""
	I1218 00:47:38.929171 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.929178 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:38.929184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:38.929250 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:38.959987 1201669 cri.go:89] found id: ""
	I1218 00:47:38.960016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.960023 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:38.960029 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:38.960093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:38.987077 1201669 cri.go:89] found id: ""
	I1218 00:47:38.987091 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.987098 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:38.987104 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:38.987160 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:39.015227 1201669 cri.go:89] found id: ""
	I1218 00:47:39.015240 1201669 logs.go:282] 0 containers: []
	W1218 00:47:39.015257 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:39.015266 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:39.015278 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:39.044299 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:39.044322 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:39.110657 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:39.110677 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:39.127155 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:39.127171 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:39.195223 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:39.195233 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:39.195243 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:41.762478 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:41.772539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:41.772600 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:41.797940 1201669 cri.go:89] found id: ""
	I1218 00:47:41.797954 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.797961 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:41.797967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:41.798024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:41.823230 1201669 cri.go:89] found id: ""
	I1218 00:47:41.823244 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.823251 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:41.823256 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:41.823314 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:41.848348 1201669 cri.go:89] found id: ""
	I1218 00:47:41.848368 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.848384 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:41.848390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:41.848447 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:41.873186 1201669 cri.go:89] found id: ""
	I1218 00:47:41.873199 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.873207 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:41.873212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:41.873269 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:41.910240 1201669 cri.go:89] found id: ""
	I1218 00:47:41.910253 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.910260 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:41.910265 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:41.910323 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:41.938636 1201669 cri.go:89] found id: ""
	I1218 00:47:41.938649 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.938656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:41.938661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:41.938723 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:41.967011 1201669 cri.go:89] found id: ""
	I1218 00:47:41.967024 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.967031 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:41.967039 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:41.967048 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:42.032273 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:42.032293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:42.047961 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:42.047977 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:42.129763 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:42.129777 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:42.129788 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:42.203638 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:42.203661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:44.747018 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:44.757561 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:44.757666 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:44.782848 1201669 cri.go:89] found id: ""
	I1218 00:47:44.782861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.782868 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:44.782873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:44.782930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:44.812029 1201669 cri.go:89] found id: ""
	I1218 00:47:44.812042 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.812049 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:44.812054 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:44.812111 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:44.835973 1201669 cri.go:89] found id: ""
	I1218 00:47:44.835986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.835994 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:44.835998 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:44.836055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:44.865506 1201669 cri.go:89] found id: ""
	I1218 00:47:44.865524 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.865532 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:44.865539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:44.865596 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:44.895590 1201669 cri.go:89] found id: ""
	I1218 00:47:44.895603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.895610 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:44.895615 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:44.895678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:44.930517 1201669 cri.go:89] found id: ""
	I1218 00:47:44.930531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.930538 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:44.930544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:44.930602 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:44.963147 1201669 cri.go:89] found id: ""
	I1218 00:47:44.963161 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.963168 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:44.963176 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:44.963187 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:45.068693 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:45.068706 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:45.068718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:45.150525 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:45.150547 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:45.198775 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:45.198795 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:45.282633 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:45.282655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:47.798966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:47.809011 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:47.809070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:47.836141 1201669 cri.go:89] found id: ""
	I1218 00:47:47.836155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.836161 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:47.836167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:47.836256 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:47.862554 1201669 cri.go:89] found id: ""
	I1218 00:47:47.862568 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.862575 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:47.862580 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:47.862645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:47.889972 1201669 cri.go:89] found id: ""
	I1218 00:47:47.889986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.889992 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:47.889997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:47.890054 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:47.922142 1201669 cri.go:89] found id: ""
	I1218 00:47:47.922155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.922162 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:47.922168 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:47.922223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:47.956979 1201669 cri.go:89] found id: ""
	I1218 00:47:47.956993 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.956999 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:47.957005 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:47.957062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:47.982938 1201669 cri.go:89] found id: ""
	I1218 00:47:47.982952 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.982959 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:47.982965 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:47.983027 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:48.014164 1201669 cri.go:89] found id: ""
	I1218 00:47:48.014178 1201669 logs.go:282] 0 containers: []
	W1218 00:47:48.014184 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:48.014192 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:48.014205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:48.078819 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:48.078831 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:48.078850 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:48.151018 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:48.151045 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:48.178919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:48.178937 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:48.246806 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:48.246828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:50.762650 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:50.772894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:50.772953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:50.798440 1201669 cri.go:89] found id: ""
	I1218 00:47:50.798453 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.798459 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:50.798468 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:50.798525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:50.824627 1201669 cri.go:89] found id: ""
	I1218 00:47:50.824641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.824648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:50.824654 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:50.824713 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:50.849720 1201669 cri.go:89] found id: ""
	I1218 00:47:50.849732 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.849740 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:50.849745 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:50.849802 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:50.873828 1201669 cri.go:89] found id: ""
	I1218 00:47:50.873841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.873849 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:50.873854 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:50.873910 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:50.905379 1201669 cri.go:89] found id: ""
	I1218 00:47:50.905392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.905399 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:50.905404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:50.905461 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:50.935677 1201669 cri.go:89] found id: ""
	I1218 00:47:50.935690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.935697 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:50.935702 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:50.935774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:50.970057 1201669 cri.go:89] found id: ""
	I1218 00:47:50.970070 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.970077 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:50.970085 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:50.970095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:51.036789 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:51.036810 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:51.051895 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:51.051913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:51.116641 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:51.116651 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:51.116663 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:51.186315 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:51.186337 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:53.718450 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:53.728262 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:53.728318 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:53.755772 1201669 cri.go:89] found id: ""
	I1218 00:47:53.755787 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.755793 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:53.755798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:53.755855 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:53.780839 1201669 cri.go:89] found id: ""
	I1218 00:47:53.780853 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.780860 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:53.780865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:53.780929 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:53.806552 1201669 cri.go:89] found id: ""
	I1218 00:47:53.806603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.806611 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:53.806616 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:53.806672 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:53.832361 1201669 cri.go:89] found id: ""
	I1218 00:47:53.832380 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.832401 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:53.832420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:53.832492 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:53.859241 1201669 cri.go:89] found id: ""
	I1218 00:47:53.859254 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.859262 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:53.859277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:53.859335 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:53.884714 1201669 cri.go:89] found id: ""
	I1218 00:47:53.884728 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.884735 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:53.884740 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:53.884803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:53.921003 1201669 cri.go:89] found id: ""
	I1218 00:47:53.921016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.921024 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:53.921031 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:53.921041 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:54.003954 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:54.003975 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:54.020878 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:54.020896 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:54.086911 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:54.086921 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:54.086943 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:54.157859 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:54.157878 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:56.687608 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:56.697675 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:56.697732 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:56.722032 1201669 cri.go:89] found id: ""
	I1218 00:47:56.722045 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.722053 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:56.722058 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:56.722113 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:56.746685 1201669 cri.go:89] found id: ""
	I1218 00:47:56.746698 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.746705 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:56.746712 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:56.746769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:56.771487 1201669 cri.go:89] found id: ""
	I1218 00:47:56.771500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.771508 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:56.771515 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:56.771571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:56.795765 1201669 cri.go:89] found id: ""
	I1218 00:47:56.795778 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.795785 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:56.795790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:56.795845 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:56.820457 1201669 cri.go:89] found id: ""
	I1218 00:47:56.820470 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.820477 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:56.820482 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:56.820543 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:56.844750 1201669 cri.go:89] found id: ""
	I1218 00:47:56.844764 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.844788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:56.844794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:56.844859 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:56.870299 1201669 cri.go:89] found id: ""
	I1218 00:47:56.870312 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.870319 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:56.870326 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:56.870336 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:56.957977 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:56.957986 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:56.957996 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:57.026903 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:57.026922 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:57.056057 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:57.056072 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:57.122322 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:57.122341 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.637384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:59.647089 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:59.647147 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:59.675785 1201669 cri.go:89] found id: ""
	I1218 00:47:59.675798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.675805 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:59.675811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:59.675868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:59.700863 1201669 cri.go:89] found id: ""
	I1218 00:47:59.700876 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.700883 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:59.700888 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:59.700951 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:59.726366 1201669 cri.go:89] found id: ""
	I1218 00:47:59.726379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.726388 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:59.726394 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:59.726449 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:59.754806 1201669 cri.go:89] found id: ""
	I1218 00:47:59.754819 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.754826 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:59.754832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:59.754887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:59.779823 1201669 cri.go:89] found id: ""
	I1218 00:47:59.779842 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.779850 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:59.779855 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:59.779931 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:59.809497 1201669 cri.go:89] found id: ""
	I1218 00:47:59.809511 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.809519 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:59.809524 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:59.809580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:59.834274 1201669 cri.go:89] found id: ""
	I1218 00:47:59.834287 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.834294 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:59.834302 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:59.834312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:59.908086 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:59.908107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.923555 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:59.923571 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:59.996659 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:59.996668 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:59.996679 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:48:00.245332 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:48:00.245355 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:48:02.854946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:48:02.865088 1201669 kubeadm.go:602] duration metric: took 4m2.280648529s to restartPrimaryControlPlane
	W1218 00:48:02.865154 1201669 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1218 00:48:02.865291 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:48:03.285302 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:48:03.298386 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:48:03.307630 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:48:03.307686 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:48:03.316384 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:48:03.316392 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:48:03.316448 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:48:03.324266 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:48:03.324330 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:48:03.332001 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:48:03.339756 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:48:03.339811 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:48:03.347895 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.356395 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:48:03.356451 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.364239 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:48:03.373496 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:48:03.373555 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:48:03.380932 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:48:03.422222 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:48:03.422277 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:48:03.498554 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:48:03.498619 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:48:03.498653 1201669 kubeadm.go:319] OS: Linux
	I1218 00:48:03.498697 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:48:03.498750 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:48:03.498797 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:48:03.498844 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:48:03.498890 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:48:03.498939 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:48:03.498983 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:48:03.499030 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:48:03.499077 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:48:03.575694 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:48:03.575807 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:48:03.575895 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:48:03.584731 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:48:03.590040 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:48:03.590125 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:48:03.590198 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:48:03.590273 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:48:03.590332 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:48:03.590401 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:48:03.590455 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:48:03.590517 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:48:03.590577 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:48:03.590649 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:48:03.590726 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:48:03.590762 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:48:03.590820 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:48:03.968959 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:48:04.492311 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:48:04.657077 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:48:05.347391 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:48:06.111689 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:48:06.112246 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:48:06.114858 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:48:06.118151 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:48:06.118267 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:48:06.118369 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:48:06.118440 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:48:06.133862 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:48:06.134164 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:48:06.143224 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:48:06.143316 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:48:06.143354 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:48:06.274772 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:48:06.274905 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:52:06.274474 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000113635s
	I1218 00:52:06.274499 1201669 kubeadm.go:319] 
	I1218 00:52:06.274555 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:52:06.274586 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:52:06.274697 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:52:06.274703 1201669 kubeadm.go:319] 
	I1218 00:52:06.274816 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:52:06.274846 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:52:06.274874 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:52:06.274877 1201669 kubeadm.go:319] 
	I1218 00:52:06.279422 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:52:06.279849 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:52:06.279958 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:52:06.280242 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1218 00:52:06.280248 1201669 kubeadm.go:319] 
	I1218 00:52:06.280323 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1218 00:52:06.280425 1201669 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000113635s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 00:52:06.280513 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:52:06.687216 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:52:06.699735 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:52:06.699788 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:52:06.707587 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:52:06.707598 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:52:06.707647 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:52:06.715175 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:52:06.715229 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:52:06.722487 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:52:06.729668 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:52:06.729722 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:52:06.736814 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.744131 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:52:06.744183 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.751469 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:52:06.758728 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:52:06.758782 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:52:06.765652 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:52:06.801363 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:52:06.801639 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:52:06.871618 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:52:06.871677 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:52:06.871709 1201669 kubeadm.go:319] OS: Linux
	I1218 00:52:06.871750 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:52:06.871795 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:52:06.871839 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:52:06.871883 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:52:06.871926 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:52:06.871970 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:52:06.872012 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:52:06.872056 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:52:06.872097 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:52:06.943596 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:52:06.943710 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:52:06.943809 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:52:06.952719 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:52:06.957986 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:52:06.958071 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:52:06.958134 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:52:06.958209 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:52:06.958270 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:52:06.958342 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:52:06.958395 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:52:06.958469 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:52:06.958529 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:52:06.958603 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:52:06.958674 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:52:06.958710 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:52:06.958765 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:52:07.159266 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:52:07.543682 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:52:07.621245 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:52:07.789755 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:52:08.258810 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:52:08.259464 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:52:08.262206 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:52:08.265520 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:52:08.265615 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:52:08.265696 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:52:08.266218 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:52:08.282138 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:52:08.282258 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:52:08.290066 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:52:08.290407 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:52:08.290607 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:52:08.422232 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:52:08.422344 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:56:08.423339 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001129518s
	I1218 00:56:08.423364 1201669 kubeadm.go:319] 
	I1218 00:56:08.423420 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:56:08.423452 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:56:08.423565 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:56:08.423570 1201669 kubeadm.go:319] 
	I1218 00:56:08.423755 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:56:08.423825 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:56:08.423872 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:56:08.423876 1201669 kubeadm.go:319] 
	I1218 00:56:08.428596 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:56:08.429049 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:56:08.429151 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:56:08.429380 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 00:56:08.429383 1201669 kubeadm.go:319] 
	I1218 00:56:08.429447 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:56:08.429502 1201669 kubeadm.go:403] duration metric: took 12m7.881074518s to StartCluster
	I1218 00:56:08.429533 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:56:08.429592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:56:08.454446 1201669 cri.go:89] found id: ""
	I1218 00:56:08.454459 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.454467 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:56:08.454472 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:56:08.454527 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:56:08.479309 1201669 cri.go:89] found id: ""
	I1218 00:56:08.479323 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.479330 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:56:08.479335 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:56:08.479395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:56:08.506727 1201669 cri.go:89] found id: ""
	I1218 00:56:08.506740 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.506747 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:56:08.506752 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:56:08.506809 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:56:08.531214 1201669 cri.go:89] found id: ""
	I1218 00:56:08.531228 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.531235 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:56:08.531240 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:56:08.531295 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:56:08.555634 1201669 cri.go:89] found id: ""
	I1218 00:56:08.555647 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.555654 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:56:08.555659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:56:08.555716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:56:08.580409 1201669 cri.go:89] found id: ""
	I1218 00:56:08.580423 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.580430 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:56:08.580435 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:56:08.580494 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:56:08.605063 1201669 cri.go:89] found id: ""
	I1218 00:56:08.605089 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.605096 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:56:08.605105 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:56:08.605116 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:56:08.684346 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:56:08.684356 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:56:08.684367 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:56:08.760495 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:56:08.760515 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:56:08.787919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:56:08.787936 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:56:08.853642 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:56:08.853661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	W1218 00:56:08.868901 1201669 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 00:56:08.868939 1201669 out.go:285] * 
	W1218 00:56:08.868999 1201669 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.869015 1201669 out.go:285] * 
	W1218 00:56:08.871456 1201669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:56:08.877860 1201669 out.go:203] 
	W1218 00:56:08.880779 1201669 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.880832 1201669 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 00:56:08.880854 1201669 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 00:56:08.883989 1201669 out.go:203] 
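	
	The SystemVerification warning above names the exact knob: for kubelet v1.35 or newer on a cgroup v1 host, the kubelet configuration option `FailCgroupV1` must be set to `false`. A minimal sketch of the corresponding KubeletConfiguration fragment follows; the `failCgroupV1` field name is from the kubelet config v1beta1 API, and whether minikube plumbs this through (e.g. via `--extra-config` or a kubeadm patch) is an assumption, not confirmed by this log:
	
	```yaml
	# Hedged sketch: KubeletConfiguration fragment allowing kubelet v1.35+
	# to start on a cgroup v1 host, per the SystemVerification warning above.
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	# The restart loop in this log implies the default refuses cgroup v1 hosts.
	failCgroupV1: false
	```
	
	Per the warning, the deprecation validation must also be explicitly skipped; see the linked KEP for details.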
	
	
	==> CRI-O <==
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113118431Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113153129Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113189559Z" level=info msg="Create NRI interface"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113282086Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113290964Z" level=info msg="runtime interface created"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113301647Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113309343Z" level=info msg="runtime interface starting up..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113315505Z" level=info msg="starting plugins..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113327796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.11339067Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:43:59 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.578897723Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=a394bef7-706e-4c2b-a83c-e7a192425f8f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.579569606Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=0b73d3f0-8cf4-4881-9be6-303c65310a78 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58003914Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=ba617c6c-560d-48a4-8069-49b5cad617df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58069138Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=1b435c90-bcae-4d5e-85b5-8f24b84aad77 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581151364Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=1ca4dc15-0b08-49d0-89ca-728ba68fd7be name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581562446Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=9758cff4-6113-4178-8c9f-4ef34a0e91ee name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581979017Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=34a384cc-3abb-4525-b194-0557e1231baf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:58:38.220074   23463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:38.220773   23463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:38.221726   23463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:38.223186   23463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:38.223661   23463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:58:38 up  7:41,  0 user,  load average: 1.51, 0.48, 0.45
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:58:35 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:36 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2325.
	Dec 18 00:58:36 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:36 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:36 functional-288604 kubelet[23323]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:36 functional-288604 kubelet[23323]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:36 functional-288604 kubelet[23323]: E1218 00:58:36.374294   23323 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:36 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:36 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2326.
	Dec 18 00:58:37 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:37 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:37 functional-288604 kubelet[23361]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:37 functional-288604 kubelet[23361]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:37 functional-288604 kubelet[23361]: E1218 00:58:37.214781   23361 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2327.
	Dec 18 00:58:37 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:37 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:37 functional-288604 kubelet[23389]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:37 functional-288604 kubelet[23389]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:37 functional-288604 kubelet[23389]: E1218 00:58:37.948182   23389 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:37 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
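	
	The restart loop above (counter at 2327) is the kubelet repeatedly failing validation because the host is on cgroup v1. A quick way to confirm which cgroup version a host exposes is to check the filesystem type at the unified mount point; this is a generic diagnostic sketch, not part of the minikube test harness:
	
	```shell
	#!/bin/sh
	# Print the filesystem type mounted at /sys/fs/cgroup.
	#   cgroup2fs -> host uses cgroup v2 (unified hierarchy)
	#   tmpfs     -> host uses the legacy cgroup v1 hierarchy
	stat -fc %T /sys/fs/cgroup
	```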
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (343.109492ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.09s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.36s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect

                                                
                                                

                                                
                                                
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-288604 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-288604 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (54.337549ms)

                                                
                                                
** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

                                                
                                                
** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-288604 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-288604 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-288604 describe po hello-node-connect: exit status 1 (78.091814ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:1614: "kubectl --context functional-288604 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-288604 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-288604 logs -l app=hello-node-connect: exit status 1 (59.840551ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:1620: "kubectl --context functional-288604 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-288604 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-288604 describe svc hello-node-connect: exit status 1 (60.747847ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:1626: "kubectl --context functional-288604 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (311.665869ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                             ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ cache   │ functional-288604 cache reload                                                                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ ssh     │ functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │ 18 Dec 25 00:43 UTC │
	│ kubectl │ functional-288604 kubectl -- --context functional-288604 get pods                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ start   │ -p functional-288604 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:43 UTC │                     │
	│ cp      │ functional-288604 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                           │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ config  │ functional-288604 config unset cpus                                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ config  │ functional-288604 config get cpus                                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │                     │
	│ config  │ functional-288604 config set cpus 2                                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ config  │ functional-288604 config get cpus                                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ config  │ functional-288604 config unset cpus                                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ ssh     │ functional-288604 ssh -n functional-288604 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ config  │ functional-288604 config get cpus                                                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │                     │
	│ ssh     │ functional-288604 ssh echo hello                                                                                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ cp      │ functional-288604 cp functional-288604:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm3792136038/001/cp-test.txt │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ ssh     │ functional-288604 ssh cat /etc/hostname                                                                                                                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ ssh     │ functional-288604 ssh -n functional-288604 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ tunnel  │ functional-288604 tunnel --alsologtostderr                                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │                     │
	│ tunnel  │ functional-288604 tunnel --alsologtostderr                                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │                     │
	│ cp      │ functional-288604 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ tunnel  │ functional-288604 tunnel --alsologtostderr                                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │                     │
	│ ssh     │ functional-288604 ssh -n functional-288604 sudo cat /tmp/does/not/exist/cp-test.txt                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:56 UTC │ 18 Dec 25 00:56 UTC │
	│ addons  │ functional-288604 addons list                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ addons  │ functional-288604 addons list -o json                                                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:43:55
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:43:55.978742 1201669 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:43:55.978849 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.978853 1201669 out.go:374] Setting ErrFile to fd 2...
	I1218 00:43:55.978857 1201669 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:43:55.979124 1201669 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:43:55.979466 1201669 out.go:368] Setting JSON to false
	I1218 00:43:55.980315 1201669 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":26784,"bootTime":1765991852,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:43:55.980372 1201669 start.go:143] virtualization:  
	I1218 00:43:55.983789 1201669 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:43:55.987542 1201669 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:43:55.987604 1201669 notify.go:221] Checking for updates...
	I1218 00:43:55.993164 1201669 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:43:55.995954 1201669 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:43:55.999614 1201669 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:43:56.002831 1201669 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:43:56.005802 1201669 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:43:56.009212 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:56.009315 1201669 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:43:56.041210 1201669 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:43:56.041338 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.105588 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.095254501 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.105683 1201669 docker.go:319] overlay module found
	I1218 00:43:56.108792 1201669 out.go:179] * Using the docker driver based on existing profile
	I1218 00:43:56.111628 1201669 start.go:309] selected driver: docker
	I1218 00:43:56.111638 1201669 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.111765 1201669 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:43:56.111873 1201669 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:43:56.170180 1201669 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-18 00:43:56.160520969 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:43:56.170597 1201669 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 00:43:56.170621 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:56.170672 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:56.170715 1201669 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:43:56.173990 1201669 out.go:179] * Starting "functional-288604" primary control-plane node in "functional-288604" cluster
	I1218 00:43:56.177055 1201669 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:43:56.179992 1201669 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:43:56.182847 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:56.182889 1201669 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:43:56.182897 1201669 cache.go:65] Caching tarball of preloaded images
	I1218 00:43:56.182969 1201669 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:43:56.182979 1201669 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 00:43:56.182988 1201669 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:43:56.183103 1201669 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/config.json ...
	I1218 00:43:56.202673 1201669 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 00:43:56.202684 1201669 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 00:43:56.202702 1201669 cache.go:243] Successfully downloaded all kic artifacts
	I1218 00:43:56.202743 1201669 start.go:360] acquireMachinesLock for functional-288604: {Name:mka2ef389e17f81d7cf61339133202b84f644e82 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 00:43:56.202797 1201669 start.go:364] duration metric: took 37.488µs to acquireMachinesLock for "functional-288604"
	I1218 00:43:56.202818 1201669 start.go:96] Skipping create...Using existing machine configuration
	I1218 00:43:56.202823 1201669 fix.go:54] fixHost starting: 
	I1218 00:43:56.203129 1201669 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
	I1218 00:43:56.220546 1201669 fix.go:112] recreateIfNeeded on functional-288604: state=Running err=<nil>
	W1218 00:43:56.220565 1201669 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 00:43:56.223742 1201669 out.go:252] * Updating the running docker "functional-288604" container ...
	I1218 00:43:56.223770 1201669 machine.go:94] provisionDockerMachine start ...
	I1218 00:43:56.223861 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.243517 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.243858 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.243865 1201669 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 00:43:56.399607 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.399622 1201669 ubuntu.go:182] provisioning hostname "functional-288604"
	I1218 00:43:56.399683 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.417287 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.417598 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.417605 1201669 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-288604 && echo "functional-288604" | sudo tee /etc/hostname
	I1218 00:43:56.583098 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-288604
	
	I1218 00:43:56.583184 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:56.603369 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:56.603669 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:56.603683 1201669 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-288604' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-288604/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-288604' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 00:43:56.772929 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 00:43:56.772944 1201669 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 00:43:56.772975 1201669 ubuntu.go:190] setting up certificates
	I1218 00:43:56.772989 1201669 provision.go:84] configureAuth start
	I1218 00:43:56.773070 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:56.789980 1201669 provision.go:143] copyHostCerts
	I1218 00:43:56.790044 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 00:43:56.790056 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 00:43:56.790131 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 00:43:56.790231 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 00:43:56.790235 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 00:43:56.790260 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 00:43:56.790310 1201669 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 00:43:56.790313 1201669 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 00:43:56.790335 1201669 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 00:43:56.790376 1201669 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.functional-288604 san=[127.0.0.1 192.168.49.2 functional-288604 localhost minikube]
	I1218 00:43:56.986120 1201669 provision.go:177] copyRemoteCerts
	I1218 00:43:56.986182 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 00:43:56.986224 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.010906 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.115839 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 00:43:57.132835 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 00:43:57.150663 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1218 00:43:57.167535 1201669 provision.go:87] duration metric: took 394.523589ms to configureAuth
	I1218 00:43:57.167552 1201669 ubuntu.go:206] setting minikube options for container-runtime
	I1218 00:43:57.167745 1201669 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:43:57.167846 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.184649 1201669 main.go:143] libmachine: Using SSH client type: native
	I1218 00:43:57.184955 1201669 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 33925 <nil> <nil>}
	I1218 00:43:57.184966 1201669 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 00:43:57.547661 1201669 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 00:43:57.547677 1201669 machine.go:97] duration metric: took 1.323900056s to provisionDockerMachine
	I1218 00:43:57.547689 1201669 start.go:293] postStartSetup for "functional-288604" (driver="docker")
	I1218 00:43:57.547701 1201669 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 00:43:57.547767 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 00:43:57.547816 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.568532 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.675839 1201669 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 00:43:57.679095 1201669 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 00:43:57.679112 1201669 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 00:43:57.679121 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 00:43:57.679176 1201669 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 00:43:57.679251 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 00:43:57.679324 1201669 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts -> hosts in /etc/test/nested/copy/1159552
	I1218 00:43:57.679367 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1159552
	I1218 00:43:57.686719 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:57.703522 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts --> /etc/test/nested/copy/1159552/hosts (40 bytes)
	I1218 00:43:57.720871 1201669 start.go:296] duration metric: took 173.166293ms for postStartSetup
	I1218 00:43:57.720943 1201669 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 00:43:57.720983 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.737854 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.841489 1201669 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 00:43:57.846519 1201669 fix.go:56] duration metric: took 1.643688341s for fixHost
	I1218 00:43:57.846534 1201669 start.go:83] releasing machines lock for "functional-288604", held for 1.6437309s
	I1218 00:43:57.846614 1201669 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-288604
	I1218 00:43:57.862813 1201669 ssh_runner.go:195] Run: cat /version.json
	I1218 00:43:57.862836 1201669 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 00:43:57.862859 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.862906 1201669 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
	I1218 00:43:57.880942 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.881296 1201669 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
	I1218 00:43:57.984097 1201669 ssh_runner.go:195] Run: systemctl --version
	I1218 00:43:58.077458 1201669 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 00:43:58.117786 1201669 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 00:43:58.128203 1201669 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 00:43:58.128283 1201669 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 00:43:58.137853 1201669 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 00:43:58.137867 1201669 start.go:496] detecting cgroup driver to use...
	I1218 00:43:58.137898 1201669 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 00:43:58.137955 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 00:43:58.154333 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 00:43:58.171243 1201669 docker.go:218] disabling cri-docker service (if available) ...
	I1218 00:43:58.171317 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 00:43:58.187629 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 00:43:58.200443 1201669 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 00:43:58.332309 1201669 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 00:43:58.456320 1201669 docker.go:234] disabling docker service ...
	I1218 00:43:58.456386 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 00:43:58.471261 1201669 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 00:43:58.484090 1201669 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 00:43:58.600872 1201669 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 00:43:58.712059 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 00:43:58.725312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 00:43:58.738398 1201669 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 00:43:58.738467 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.746850 1201669 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 00:43:58.746917 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.755273 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.763400 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.771727 1201669 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 00:43:58.779324 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.788210 1201669 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.796348 1201669 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 00:43:58.804389 1201669 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 00:43:58.811403 1201669 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 00:43:58.818408 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:58.951912 1201669 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 00:43:59.118783 1201669 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 00:43:59.118849 1201669 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 00:43:59.122545 1201669 start.go:564] Will wait 60s for crictl version
	I1218 00:43:59.122604 1201669 ssh_runner.go:195] Run: which crictl
	I1218 00:43:59.126019 1201669 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 00:43:59.148982 1201669 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 00:43:59.149067 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.175940 1201669 ssh_runner.go:195] Run: crio --version
	I1218 00:43:59.206912 1201669 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 00:43:59.209698 1201669 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 00:43:59.225649 1201669 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1218 00:43:59.232549 1201669 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1218 00:43:59.235431 1201669 kubeadm.go:884] updating cluster {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 00:43:59.235543 1201669 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:43:59.235614 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.273407 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.273418 1201669 crio.go:433] Images already preloaded, skipping extraction
	I1218 00:43:59.273471 1201669 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 00:43:59.299275 1201669 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 00:43:59.299287 1201669 cache_images.go:86] Images are preloaded, skipping loading
	I1218 00:43:59.299293 1201669 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 crio true true} ...
	I1218 00:43:59.299404 1201669 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=functional-288604 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 00:43:59.299490 1201669 ssh_runner.go:195] Run: crio config
	I1218 00:43:59.362084 1201669 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1218 00:43:59.362106 1201669 cni.go:84] Creating CNI manager for ""
	I1218 00:43:59.362113 1201669 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:43:59.362126 1201669 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 00:43:59.362149 1201669 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-288604 NodeName:functional-288604 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 00:43:59.362277 1201669 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "functional-288604"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 00:43:59.362352 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 00:43:59.369805 1201669 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 00:43:59.369864 1201669 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 00:43:59.376968 1201669 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (372 bytes)
	I1218 00:43:59.388765 1201669 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 00:43:59.400454 1201669 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2069 bytes)
	I1218 00:43:59.412514 1201669 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1218 00:43:59.416040 1201669 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 00:43:59.531606 1201669 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 00:43:59.640794 1201669 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604 for IP: 192.168.49.2
	I1218 00:43:59.640805 1201669 certs.go:195] generating shared ca certs ...
	I1218 00:43:59.640830 1201669 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:43:59.640959 1201669 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 00:43:59.641001 1201669 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 00:43:59.641007 1201669 certs.go:257] generating profile certs ...
	I1218 00:43:59.641121 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.key
	I1218 00:43:59.641164 1201669 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key.9182ce28
	I1218 00:43:59.641201 1201669 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key
	I1218 00:43:59.641309 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 00:43:59.641337 1201669 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 00:43:59.641343 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 00:43:59.641373 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 00:43:59.641395 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 00:43:59.641423 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 00:43:59.641463 1201669 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 00:43:59.642073 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 00:43:59.660992 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 00:43:59.679818 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 00:43:59.699150 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 00:43:59.718895 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1218 00:43:59.738413 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 00:43:59.756315 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 00:43:59.773826 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 00:43:59.791059 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 00:43:59.807447 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 00:43:59.824212 1201669 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 00:43:59.841186 1201669 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 00:43:59.853492 1201669 ssh_runner.go:195] Run: openssl version
	I1218 00:43:59.859998 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.866869 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 00:43:59.873885 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877278 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.877331 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 00:43:59.917714 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 00:43:59.925047 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.932048 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 00:43:59.939101 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942813 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.942866 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 00:43:59.983421 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 00:43:59.990593 1201669 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 00:43:59.997725 1201669 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 00:44:00.042943 1201669 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059312 1201669 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.059393 1201669 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 00:44:00.179416 1201669 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 00:44:00.199517 1201669 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 00:44:00.211411 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 00:44:00.299862 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 00:44:00.347783 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 00:44:00.400161 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 00:44:00.445236 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 00:44:00.505288 1201669 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 00:44:00.548440 1201669 kubeadm.go:401] StartCluster: {Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:44:00.548538 1201669 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 00:44:00.548659 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.576531 1201669 cri.go:89] found id: ""
	I1218 00:44:00.576602 1201669 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 00:44:00.584414 1201669 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 00:44:00.584430 1201669 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 00:44:00.584481 1201669 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 00:44:00.591678 1201669 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.592197 1201669 kubeconfig.go:125] found "functional-288604" server: "https://192.168.49.2:8441"
	I1218 00:44:00.593407 1201669 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 00:44:00.601066 1201669 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-18 00:29:23.211763247 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-18 00:43:59.405160305 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1218 00:44:00.601075 1201669 kubeadm.go:1161] stopping kube-system containers ...
	I1218 00:44:00.601085 1201669 cri.go:54] listing CRI containers in root : {State:all Name: Namespaces:[kube-system]}
	I1218 00:44:00.601140 1201669 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 00:44:00.626991 1201669 cri.go:89] found id: ""
	I1218 00:44:00.627065 1201669 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 00:44:00.640495 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:44:00.648256 1201669 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 18 00:33 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec 18 00:33 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 18 00:33 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 18 00:33 /etc/kubernetes/scheduler.conf
	
	I1218 00:44:00.648311 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:44:00.655772 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:44:00.663347 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.663410 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:44:00.670748 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.677977 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.678031 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:44:00.685079 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:44:00.692996 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 00:44:00.693049 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:44:00.700106 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:44:00.707647 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:00.751682 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:01.971643 1201669 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.219916809s)
	I1218 00:44:01.971736 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.213563 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.279593 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 00:44:02.331094 1201669 api_server.go:52] waiting for apiserver process to appear ...
	I1218 00:44:02.331177 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:02.831338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.332205 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:03.831381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.331525 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:04.832325 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.331379 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:05.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.332243 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:06.831869 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.331354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:07.831326 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.331942 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:08.831354 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.331370 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:09.832255 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.331366 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:10.831363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:11.831359 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.331357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:12.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.331990 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:13.831891 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.331340 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:14.832123 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.331341 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:15.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.332060 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:16.831352 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.331755 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:17.831466 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.331860 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:18.831293 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.332008 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:19.831369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.331585 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:20.832126 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.331328 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:21.831986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.331369 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:22.831627 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.331975 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:23.831268 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.331992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:24.831394 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.331896 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:25.831502 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.331383 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:26.831706 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.332082 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:27.831353 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.331380 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:28.832133 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.331347 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:29.831351 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.332001 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:30.831800 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.331774 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:31.831372 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.332276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:32.832017 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.331329 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:33.832065 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.331713 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:34.831374 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.331600 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:35.831577 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.332164 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:36.831455 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.331933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:37.831358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.332063 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:38.831460 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.331554 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:39.832152 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.331280 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:40.831272 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.332273 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:41.832020 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.331662 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:42.831758 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.331412 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:43.831371 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.332088 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:44.831480 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.332490 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:45.832201 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.331816 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:46.831276 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.331408 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:47.831739 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.331262 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:48.831814 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.332083 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:49.832108 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.331984 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:50.831507 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.331363 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:51.831505 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.332120 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:52.831384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.332279 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:53.831590 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.331361 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:54.831933 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.331338 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:55.831357 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.332254 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:56.832148 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.331950 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:57.831349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.332302 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:58.832264 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.331912 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:44:59.832145 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.331498 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:00.831848 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.331497 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:01.831406 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:02.332289 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:02.332395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:02.358402 1201669 cri.go:89] found id: ""
	I1218 00:45:02.358416 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.358424 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:02.358429 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:02.358493 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:02.386799 1201669 cri.go:89] found id: ""
	I1218 00:45:02.386814 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.386821 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:02.386825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:02.386882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:02.419430 1201669 cri.go:89] found id: ""
	I1218 00:45:02.419445 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.419453 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:02.419460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:02.419560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:02.445313 1201669 cri.go:89] found id: ""
	I1218 00:45:02.445326 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.445333 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:02.445338 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:02.445395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:02.474189 1201669 cri.go:89] found id: ""
	I1218 00:45:02.474203 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.474210 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:02.474215 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:02.474278 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:02.501782 1201669 cri.go:89] found id: ""
	I1218 00:45:02.501796 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.501803 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:02.501808 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:02.501867 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:02.531648 1201669 cri.go:89] found id: ""
	I1218 00:45:02.531662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:02.531669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:02.531677 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:02.531690 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:02.597077 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:02.597095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:02.612827 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:02.612845 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:02.680833 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:02.672362   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.673058   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.674821   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.675194   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:02.676720   11025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:02.680844 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:02.680855 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:02.749861 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:02.749884 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:05.287966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:05.298109 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:05.298171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:05.323714 1201669 cri.go:89] found id: ""
	I1218 00:45:05.323727 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.323733 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:05.323739 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:05.323800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:05.348520 1201669 cri.go:89] found id: ""
	I1218 00:45:05.348534 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.348541 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:05.348546 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:05.348604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:05.373275 1201669 cri.go:89] found id: ""
	I1218 00:45:05.373290 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.373297 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:05.373302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:05.373362 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:05.397833 1201669 cri.go:89] found id: ""
	I1218 00:45:05.397846 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.397853 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:05.397859 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:05.397921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:05.422938 1201669 cri.go:89] found id: ""
	I1218 00:45:05.422952 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.422959 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:05.422964 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:05.423026 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:05.451027 1201669 cri.go:89] found id: ""
	I1218 00:45:05.451041 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.451048 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:05.451053 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:05.451115 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:05.477082 1201669 cri.go:89] found id: ""
	I1218 00:45:05.477096 1201669 logs.go:282] 0 containers: []
	W1218 00:45:05.477102 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:05.477110 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:05.477120 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:05.543065 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:05.543083 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:05.558032 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:05.558047 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:05.623058 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:05.613129   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.615468   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.616309   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.617841   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:05.618302   11129 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:05.623071 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:05.623081 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:05.694967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:05.694987 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.224381 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:08.234565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:08.234639 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:08.262640 1201669 cri.go:89] found id: ""
	I1218 00:45:08.262654 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.262661 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:08.262667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:08.262724 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:08.288384 1201669 cri.go:89] found id: ""
	I1218 00:45:08.288397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.288404 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:08.288409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:08.288468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:08.314880 1201669 cri.go:89] found id: ""
	I1218 00:45:08.314893 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.314900 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:08.314911 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:08.314971 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:08.340105 1201669 cri.go:89] found id: ""
	I1218 00:45:08.340119 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.340125 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:08.340131 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:08.340202 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:08.370009 1201669 cri.go:89] found id: ""
	I1218 00:45:08.370023 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.370030 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:08.370035 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:08.370094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:08.394925 1201669 cri.go:89] found id: ""
	I1218 00:45:08.394939 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.394946 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:08.394951 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:08.395013 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:08.419448 1201669 cri.go:89] found id: ""
	I1218 00:45:08.419462 1201669 logs.go:282] 0 containers: []
	W1218 00:45:08.419469 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:08.419477 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:08.419487 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:08.493271 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:08.493290 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:08.521236 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:08.521251 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:08.591011 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:08.591030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:08.605700 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:08.605716 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:08.674615 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:08.666225   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.666910   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.668550   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.669135   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:08.670796   11247 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
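The cycle above repeats minikube's per-component container check: for each control-plane component it runs `sudo crictl ps -a --quiet --name=<component>` and logs a warning when the id list comes back empty. A minimal sketch of that decision (the `check_component` helper is hypothetical, not minikube's code; the empty string stands in for the empty `crictl` output seen in this log):

```shell
#!/bin/sh
# Hypothetical helper mirroring minikube's logs.go behaviour: given the
# output of `crictl ps -a --quiet --name=<comp>`, warn when it is empty.
check_component() {
  comp="$1"; ids="$2"
  if [ -z "$ids" ]; then
    # Matches the W-level log line emitted for every component above.
    echo "No container was found matching \"$comp\""
  else
    echo "found: $ids"
  fi
}

# In this failed run every query returned an empty id list:
for comp in kube-apiserver etcd coredns kube-scheduler kube-proxy \
            kube-controller-manager kindnet; do
  check_component "$comp" ""   # "" stands in for the real crictl output
done
```

Because all seven checks come back empty, minikube falls through to gathering kubelet, dmesg, CRI-O, and `describe nodes` logs instead.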
	I1218 00:45:11.175336 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:11.186731 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:11.186790 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:11.217495 1201669 cri.go:89] found id: ""
	I1218 00:45:11.217510 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.217517 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:11.217522 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:11.217579 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:11.242494 1201669 cri.go:89] found id: ""
	I1218 00:45:11.242506 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.242514 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:11.242519 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:11.242588 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:11.269562 1201669 cri.go:89] found id: ""
	I1218 00:45:11.269576 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.269583 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:11.269588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:11.269646 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:11.296483 1201669 cri.go:89] found id: ""
	I1218 00:45:11.296497 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.296503 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:11.296517 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:11.296573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:11.324023 1201669 cri.go:89] found id: ""
	I1218 00:45:11.324037 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.324044 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:11.324049 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:11.324107 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:11.350813 1201669 cri.go:89] found id: ""
	I1218 00:45:11.350826 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.350833 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:11.350838 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:11.350915 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:11.375508 1201669 cri.go:89] found id: ""
	I1218 00:45:11.375522 1201669 logs.go:282] 0 containers: []
	W1218 00:45:11.375529 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:11.375538 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:11.375548 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:11.443170 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:11.443196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:11.458193 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:11.458209 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:11.526119 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:11.517992   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.518912   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.520453   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.521113   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:11.522366   11343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:11.526129 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:11.526139 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:11.598390 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:11.598409 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:14.127470 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:14.140176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:14.140248 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:14.175467 1201669 cri.go:89] found id: ""
	I1218 00:45:14.175481 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.175488 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:14.175493 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:14.175550 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:14.205623 1201669 cri.go:89] found id: ""
	I1218 00:45:14.205637 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.205649 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:14.205655 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:14.205727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:14.232765 1201669 cri.go:89] found id: ""
	I1218 00:45:14.232779 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.232786 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:14.232790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:14.232848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:14.259382 1201669 cri.go:89] found id: ""
	I1218 00:45:14.259396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.259403 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:14.259408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:14.259465 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:14.284118 1201669 cri.go:89] found id: ""
	I1218 00:45:14.284132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.284139 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:14.284144 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:14.284205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:14.308510 1201669 cri.go:89] found id: ""
	I1218 00:45:14.308530 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.308536 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:14.308552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:14.308619 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:14.336798 1201669 cri.go:89] found id: ""
	I1218 00:45:14.336811 1201669 logs.go:282] 0 containers: []
	W1218 00:45:14.336819 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:14.336826 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:14.336837 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:14.402054 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:14.402074 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:14.416289 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:14.416306 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:14.480242 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:14.472692   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.473192   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.474645   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.475056   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:14.476501   11452 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:14.480255 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:14.480265 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:14.549733 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:14.549753 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:17.078515 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:17.088248 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:17.088306 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:17.112977 1201669 cri.go:89] found id: ""
	I1218 00:45:17.112990 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.112998 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:17.113004 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:17.113062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:17.137141 1201669 cri.go:89] found id: ""
	I1218 00:45:17.137154 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.137161 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:17.137167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:17.137223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:17.166013 1201669 cri.go:89] found id: ""
	I1218 00:45:17.166026 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.166033 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:17.166038 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:17.166098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:17.194884 1201669 cri.go:89] found id: ""
	I1218 00:45:17.194906 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.194920 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:17.194925 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:17.194990 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:17.220329 1201669 cri.go:89] found id: ""
	I1218 00:45:17.220342 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.220349 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:17.220354 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:17.220415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:17.248333 1201669 cri.go:89] found id: ""
	I1218 00:45:17.248347 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.248353 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:17.248359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:17.248415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:17.273057 1201669 cri.go:89] found id: ""
	I1218 00:45:17.273074 1201669 logs.go:282] 0 containers: []
	W1218 00:45:17.273084 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:17.273093 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:17.273104 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:17.339448 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:17.339467 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:17.354635 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:17.354652 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:17.422682 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:17.414648   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.415260   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.416964   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.417429   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:17.418795   11560 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:17.422703 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:17.422714 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:17.490930 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:17.490951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.021992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:20.032625 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:20.032687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:20.060698 1201669 cri.go:89] found id: ""
	I1218 00:45:20.060712 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.060719 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:20.060724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:20.060785 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:20.086680 1201669 cri.go:89] found id: ""
	I1218 00:45:20.086694 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.086701 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:20.086706 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:20.086766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:20.112553 1201669 cri.go:89] found id: ""
	I1218 00:45:20.112567 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.112574 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:20.112579 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:20.112642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:20.137056 1201669 cri.go:89] found id: ""
	I1218 00:45:20.137070 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.137077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:20.137082 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:20.137148 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:20.175745 1201669 cri.go:89] found id: ""
	I1218 00:45:20.175758 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.175775 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:20.175780 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:20.175848 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:20.205557 1201669 cri.go:89] found id: ""
	I1218 00:45:20.205570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.205578 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:20.205583 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:20.205645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:20.234726 1201669 cri.go:89] found id: ""
	I1218 00:45:20.234739 1201669 logs.go:282] 0 containers: []
	W1218 00:45:20.234746 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:20.234754 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:20.234773 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:20.303025 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:20.303044 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:20.331096 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:20.331118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:20.397831 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:20.397856 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:20.412745 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:20.412761 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:20.480267 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:20.471919   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.472652   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474391   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.474986   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:20.476493   11676 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:22.980543 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:22.990690 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:22.990747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:23.016762 1201669 cri.go:89] found id: ""
	I1218 00:45:23.016795 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.016802 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:23.016807 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:23.016868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:23.042294 1201669 cri.go:89] found id: ""
	I1218 00:45:23.042308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.042315 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:23.042320 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:23.042379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:23.071377 1201669 cri.go:89] found id: ""
	I1218 00:45:23.071392 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.071399 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:23.071405 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:23.071463 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:23.096911 1201669 cri.go:89] found id: ""
	I1218 00:45:23.096925 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.096932 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:23.096938 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:23.097002 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:23.123350 1201669 cri.go:89] found id: ""
	I1218 00:45:23.123363 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.123370 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:23.123375 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:23.123455 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:23.156384 1201669 cri.go:89] found id: ""
	I1218 00:45:23.156397 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.156404 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:23.156409 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:23.156470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:23.198764 1201669 cri.go:89] found id: ""
	I1218 00:45:23.198777 1201669 logs.go:282] 0 containers: []
	W1218 00:45:23.198784 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:23.198792 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:23.198802 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:23.276991 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:23.277016 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:23.305929 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:23.305946 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:23.374243 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:23.374263 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:23.389391 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:23.389408 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:23.455741 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:23.447433   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.447960   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.449531   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.450195   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:23.451856   11780 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:25.956010 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:25.966329 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:25.966402 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:25.992362 1201669 cri.go:89] found id: ""
	I1218 00:45:25.992376 1201669 logs.go:282] 0 containers: []
	W1218 00:45:25.992383 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:25.992388 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:25.992446 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:26.020474 1201669 cri.go:89] found id: ""
	I1218 00:45:26.020487 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.020495 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:26.020500 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:26.020562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:26.053060 1201669 cri.go:89] found id: ""
	I1218 00:45:26.053083 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.053090 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:26.053096 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:26.053168 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:26.080555 1201669 cri.go:89] found id: ""
	I1218 00:45:26.080570 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.080577 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:26.080582 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:26.080642 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:26.106383 1201669 cri.go:89] found id: ""
	I1218 00:45:26.106396 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.106405 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:26.106413 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:26.106472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:26.133033 1201669 cri.go:89] found id: ""
	I1218 00:45:26.133046 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.133053 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:26.133059 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:26.133114 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:26.166644 1201669 cri.go:89] found id: ""
	I1218 00:45:26.166662 1201669 logs.go:282] 0 containers: []
	W1218 00:45:26.166669 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:26.166683 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:26.166693 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:26.249137 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:26.249156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:26.266352 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:26.266372 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:26.337214 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:26.327799   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.328400   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.330240   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.331061   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:26.332033   11876 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:26.337225 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:26.337235 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:26.407577 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:26.407597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:28.937809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:28.947798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:28.947860 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:28.972642 1201669 cri.go:89] found id: ""
	I1218 00:45:28.972655 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.972662 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:28.972667 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:28.972727 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:28.997812 1201669 cri.go:89] found id: ""
	I1218 00:45:28.997827 1201669 logs.go:282] 0 containers: []
	W1218 00:45:28.997834 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:28.997839 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:28.997897 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:29.025172 1201669 cri.go:89] found id: ""
	I1218 00:45:29.025188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.025195 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:29.025200 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:29.025261 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:29.050129 1201669 cri.go:89] found id: ""
	I1218 00:45:29.050143 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.050151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:29.050156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:29.050216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:29.074056 1201669 cri.go:89] found id: ""
	I1218 00:45:29.074069 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.074076 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:29.074081 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:29.074138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:29.102343 1201669 cri.go:89] found id: ""
	I1218 00:45:29.102356 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.102363 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:29.102369 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:29.102426 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:29.126969 1201669 cri.go:89] found id: ""
	I1218 00:45:29.126982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:29.126989 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:29.126996 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:29.127007 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:29.201687 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:29.201704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:29.216680 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:29.216696 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:29.290639 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:29.275612   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.276342   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.283714   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.284844   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:29.285463   11983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:29.290648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:29.290670 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:29.363990 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:29.364013 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:31.902567 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:31.912532 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:31.912590 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:31.938305 1201669 cri.go:89] found id: ""
	I1218 00:45:31.938319 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.938326 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:31.938331 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:31.938387 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:31.962545 1201669 cri.go:89] found id: ""
	I1218 00:45:31.962558 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.962565 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:31.962570 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:31.962632 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:31.987508 1201669 cri.go:89] found id: ""
	I1218 00:45:31.987521 1201669 logs.go:282] 0 containers: []
	W1218 00:45:31.987529 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:31.987534 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:31.987592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:32.014382 1201669 cri.go:89] found id: ""
	I1218 00:45:32.014395 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.014402 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:32.014408 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:32.014474 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:32.041186 1201669 cri.go:89] found id: ""
	I1218 00:45:32.041200 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.041207 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:32.041212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:32.041271 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:32.067285 1201669 cri.go:89] found id: ""
	I1218 00:45:32.067308 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.067316 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:32.067322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:32.067382 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:32.094234 1201669 cri.go:89] found id: ""
	I1218 00:45:32.094247 1201669 logs.go:282] 0 containers: []
	W1218 00:45:32.094254 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:32.094262 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:32.094272 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:32.164781 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:32.164800 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:32.197838 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:32.197854 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:32.268628 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:32.268648 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:32.282984 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:32.283001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:32.352888 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:32.344646   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.345274   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.346770   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.347294   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:32.348798   12101 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:34.853182 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:34.863312 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:34.863372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:34.887731 1201669 cri.go:89] found id: ""
	I1218 00:45:34.887745 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.887751 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:34.887756 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:34.887813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:34.913433 1201669 cri.go:89] found id: ""
	I1218 00:45:34.913446 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.913453 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:34.913458 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:34.913525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:34.938029 1201669 cri.go:89] found id: ""
	I1218 00:45:34.938043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.938050 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:34.938056 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:34.938125 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:34.963314 1201669 cri.go:89] found id: ""
	I1218 00:45:34.963327 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.963334 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:34.963339 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:34.963395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:34.991684 1201669 cri.go:89] found id: ""
	I1218 00:45:34.991699 1201669 logs.go:282] 0 containers: []
	W1218 00:45:34.991706 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:34.991711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:34.991775 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:35.019323 1201669 cri.go:89] found id: ""
	I1218 00:45:35.019338 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.019344 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:35.019350 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:35.019412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:35.044946 1201669 cri.go:89] found id: ""
	I1218 00:45:35.044960 1201669 logs.go:282] 0 containers: []
	W1218 00:45:35.044966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:35.044975 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:35.044986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:35.059688 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:35.059704 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:35.127679 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:35.118657   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.119522   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.121561   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.122287   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:35.123640   12186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:35.127700 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:35.127711 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:35.200793 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:35.200812 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:35.229277 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:35.229293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:37.797709 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:37.807597 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:37.807657 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:37.834367 1201669 cri.go:89] found id: ""
	I1218 00:45:37.834381 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.834399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:37.834404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:37.834466 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:37.862884 1201669 cri.go:89] found id: ""
	I1218 00:45:37.862898 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.862905 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:37.862910 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:37.862967 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:37.887715 1201669 cri.go:89] found id: ""
	I1218 00:45:37.887729 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.887736 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:37.887741 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:37.887800 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:37.912412 1201669 cri.go:89] found id: ""
	I1218 00:45:37.912425 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.912432 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:37.912437 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:37.912500 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:37.936195 1201669 cri.go:89] found id: ""
	I1218 00:45:37.936209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.936216 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:37.936250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:37.936308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:37.964630 1201669 cri.go:89] found id: ""
	I1218 00:45:37.964645 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.964658 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:37.964663 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:37.964718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:37.997426 1201669 cri.go:89] found id: ""
	I1218 00:45:37.997439 1201669 logs.go:282] 0 containers: []
	W1218 00:45:37.997446 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:37.997454 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:37.997468 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:38.035686 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:38.035710 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:38.103558 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:38.103578 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:38.118520 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:38.118538 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:38.213391 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:38.200537   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.201245   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.202849   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.203411   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:38.206506   12305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:38.213399 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:38.213410 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:40.782711 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:40.792421 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:40.792487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:40.816808 1201669 cri.go:89] found id: ""
	I1218 00:45:40.816821 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.816828 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:40.816833 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:40.816889 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:40.842296 1201669 cri.go:89] found id: ""
	I1218 00:45:40.842309 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.842316 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:40.842321 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:40.842381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:40.870550 1201669 cri.go:89] found id: ""
	I1218 00:45:40.870563 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.870570 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:40.870575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:40.870631 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:40.895987 1201669 cri.go:89] found id: ""
	I1218 00:45:40.896000 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.896007 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:40.896012 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:40.896071 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:40.922196 1201669 cri.go:89] found id: ""
	I1218 00:45:40.922209 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.922217 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:40.922228 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:40.922287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:40.951012 1201669 cri.go:89] found id: ""
	I1218 00:45:40.951025 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.951032 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:40.951037 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:40.951094 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:40.975029 1201669 cri.go:89] found id: ""
	I1218 00:45:40.975043 1201669 logs.go:282] 0 containers: []
	W1218 00:45:40.975049 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:40.975057 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:40.975068 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:41.038362 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:41.030164   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.030922   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.032563   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.033156   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:41.034683   12393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:41.038371 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:41.038383 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:41.106531 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:41.106550 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:41.133380 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:41.133396 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:41.202955 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:41.202974 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.720946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:43.730523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:43.730580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:43.758480 1201669 cri.go:89] found id: ""
	I1218 00:45:43.758494 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.758501 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:43.758506 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:43.758562 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:43.782891 1201669 cri.go:89] found id: ""
	I1218 00:45:43.782904 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.782910 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:43.782915 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:43.782969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:43.807881 1201669 cri.go:89] found id: ""
	I1218 00:45:43.807895 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.807901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:43.807906 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:43.807962 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:43.831922 1201669 cri.go:89] found id: ""
	I1218 00:45:43.831934 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.831941 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:43.831946 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:43.832005 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:43.857303 1201669 cri.go:89] found id: ""
	I1218 00:45:43.857316 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.857323 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:43.857328 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:43.857385 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:43.882932 1201669 cri.go:89] found id: ""
	I1218 00:45:43.882945 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.882962 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:43.882967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:43.883034 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:43.910989 1201669 cri.go:89] found id: ""
	I1218 00:45:43.911003 1201669 logs.go:282] 0 containers: []
	W1218 00:45:43.911010 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:43.911017 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:43.911027 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:43.976855 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:43.976875 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:43.992065 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:43.992080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:44.066663 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:44.057211   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.057977   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.059774   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.060599   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:44.062293   12505 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:44.066673 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:44.066683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:44.136150 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:44.136169 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.674809 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:46.685189 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:46.685253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:46.710336 1201669 cri.go:89] found id: ""
	I1218 00:45:46.710350 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.710357 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:46.710362 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:46.710423 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:46.735331 1201669 cri.go:89] found id: ""
	I1218 00:45:46.735344 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.735351 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:46.735356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:46.735412 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:46.760113 1201669 cri.go:89] found id: ""
	I1218 00:45:46.760126 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.760133 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:46.760138 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:46.760192 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:46.785212 1201669 cri.go:89] found id: ""
	I1218 00:45:46.785225 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.785231 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:46.785237 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:46.785292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:46.810594 1201669 cri.go:89] found id: ""
	I1218 00:45:46.810607 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.810614 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:46.810619 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:46.810678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:46.835217 1201669 cri.go:89] found id: ""
	I1218 00:45:46.835231 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.835237 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:46.835242 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:46.835300 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:46.859864 1201669 cri.go:89] found id: ""
	I1218 00:45:46.859877 1201669 logs.go:282] 0 containers: []
	W1218 00:45:46.859891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:46.859899 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:46.859910 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:46.887041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:46.887057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:46.953500 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:46.953519 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:46.968086 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:46.968102 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:47.030071 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:47.022147   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.022689   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024314   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.024778   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:47.026308   12623 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:47.030081 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:47.030091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.602443 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:49.612708 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:49.612770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:49.638886 1201669 cri.go:89] found id: ""
	I1218 00:45:49.638900 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.638907 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:49.638912 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:49.638969 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:49.666118 1201669 cri.go:89] found id: ""
	I1218 00:45:49.666132 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.666139 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:49.666145 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:49.666205 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:49.695529 1201669 cri.go:89] found id: ""
	I1218 00:45:49.695542 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.695549 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:49.695554 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:49.695609 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:49.718430 1201669 cri.go:89] found id: ""
	I1218 00:45:49.718444 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.718451 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:49.718457 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:49.718514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:49.742944 1201669 cri.go:89] found id: ""
	I1218 00:45:49.742957 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.742964 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:49.742969 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:49.743028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:49.767863 1201669 cri.go:89] found id: ""
	I1218 00:45:49.767876 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.767888 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:49.767894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:49.767949 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:49.792207 1201669 cri.go:89] found id: ""
	I1218 00:45:49.792254 1201669 logs.go:282] 0 containers: []
	W1218 00:45:49.792261 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:49.792269 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:49.792279 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:49.806632 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:49.806655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:49.869094 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:49.860401   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.860953   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.862709   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.863421   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:49.864891   12712 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:49.869105 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:49.869130 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:49.936480 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:49.936498 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:49.965414 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:49.965430 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.533961 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:52.543970 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:52.544028 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:52.569650 1201669 cri.go:89] found id: ""
	I1218 00:45:52.569663 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.569671 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:52.569676 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:52.569735 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:52.593935 1201669 cri.go:89] found id: ""
	I1218 00:45:52.593949 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.593955 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:52.593961 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:52.594019 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:52.618968 1201669 cri.go:89] found id: ""
	I1218 00:45:52.618982 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.618989 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:52.618994 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:52.619051 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:52.647696 1201669 cri.go:89] found id: ""
	I1218 00:45:52.647710 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.647717 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:52.647728 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:52.647787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:52.675609 1201669 cri.go:89] found id: ""
	I1218 00:45:52.675622 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.675629 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:52.675634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:52.675690 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:52.701982 1201669 cri.go:89] found id: ""
	I1218 00:45:52.701995 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.702001 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:52.702007 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:52.702064 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:52.727053 1201669 cri.go:89] found id: ""
	I1218 00:45:52.727066 1201669 logs.go:282] 0 containers: []
	W1218 00:45:52.727073 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:52.727081 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:52.727091 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:52.793606 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:52.793626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:52.807921 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:52.807938 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:52.871908 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:52.863368   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.864337   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.865206   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.866809   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:52.867202   12820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:52.871918 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:52.871942 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:52.939995 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:52.940015 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.467573 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:55.477751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:55.477808 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:55.503215 1201669 cri.go:89] found id: ""
	I1218 00:45:55.503229 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.503235 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:55.503241 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:55.503299 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:55.528321 1201669 cri.go:89] found id: ""
	I1218 00:45:55.528334 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.528341 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:55.528346 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:55.528406 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:55.555566 1201669 cri.go:89] found id: ""
	I1218 00:45:55.555580 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.555586 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:55.555591 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:55.555659 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:55.580858 1201669 cri.go:89] found id: ""
	I1218 00:45:55.580870 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.580877 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:55.580882 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:55.580941 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:55.609703 1201669 cri.go:89] found id: ""
	I1218 00:45:55.609717 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.609724 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:55.609729 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:55.609792 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:55.635271 1201669 cri.go:89] found id: ""
	I1218 00:45:55.635285 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.635301 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:55.635307 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:55.635379 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:55.664174 1201669 cri.go:89] found id: ""
	I1218 00:45:55.664188 1201669 logs.go:282] 0 containers: []
	W1218 00:45:55.664203 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:55.664211 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:55.664247 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:55.678574 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:55.678597 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:55.741880 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:55.733391   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.733775   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.735418   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.736000   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:55.737594   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:45:55.741890 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:55.741900 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:55.814783 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:55.814804 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:55.845128 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:55.845151 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.416331 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:45:58.426299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:45:58.426355 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:45:58.457684 1201669 cri.go:89] found id: ""
	I1218 00:45:58.457698 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.457705 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:45:58.457710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:45:58.457769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:45:58.482307 1201669 cri.go:89] found id: ""
	I1218 00:45:58.482320 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.482327 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:45:58.482332 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:45:58.482389 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:45:58.507442 1201669 cri.go:89] found id: ""
	I1218 00:45:58.507454 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.507461 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:45:58.507466 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:45:58.507523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:45:58.536949 1201669 cri.go:89] found id: ""
	I1218 00:45:58.536963 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.536969 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:45:58.536974 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:45:58.537030 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:45:58.565233 1201669 cri.go:89] found id: ""
	I1218 00:45:58.565246 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.565253 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:45:58.565257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:45:58.565313 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:45:58.589568 1201669 cri.go:89] found id: ""
	I1218 00:45:58.589582 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.589589 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:45:58.589594 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:45:58.589655 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:45:58.613117 1201669 cri.go:89] found id: ""
	I1218 00:45:58.613130 1201669 logs.go:282] 0 containers: []
	W1218 00:45:58.613137 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:45:58.613145 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:45:58.613156 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:45:58.681549 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:45:58.681572 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:45:58.709658 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:45:58.709678 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:45:58.778632 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:45:58.778651 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:45:58.793209 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:45:58.793225 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:45:58.857093 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:45:58.849079   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.849652   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851134   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.851739   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:45:58.853333   13042 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:01.358084 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:01.368502 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:01.368561 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:01.399458 1201669 cri.go:89] found id: ""
	I1218 00:46:01.399490 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.399498 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:01.399504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:01.399589 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:01.429331 1201669 cri.go:89] found id: ""
	I1218 00:46:01.429346 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.429353 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:01.429359 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:01.429418 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:01.463764 1201669 cri.go:89] found id: ""
	I1218 00:46:01.463777 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.463784 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:01.463792 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:01.463852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:01.490438 1201669 cri.go:89] found id: ""
	I1218 00:46:01.490451 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.490458 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:01.490464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:01.490523 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:01.515150 1201669 cri.go:89] found id: ""
	I1218 00:46:01.515163 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.515170 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:01.515176 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:01.515238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:01.541480 1201669 cri.go:89] found id: ""
	I1218 00:46:01.541494 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.541501 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:01.541507 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:01.541567 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:01.566788 1201669 cri.go:89] found id: ""
	I1218 00:46:01.566802 1201669 logs.go:282] 0 containers: []
	W1218 00:46:01.566809 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:01.566817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:01.566827 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:01.630909 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:01.622550   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.623304   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.624851   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.625362   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:01.626989   13128 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:01.630919 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:01.630929 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:01.699339 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:01.699360 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:01.730198 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:01.730213 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:01.798536 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:01.798555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:04.314812 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:04.325258 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:04.325319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:04.350282 1201669 cri.go:89] found id: ""
	I1218 00:46:04.350302 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.350309 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:04.350314 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:04.350374 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:04.375290 1201669 cri.go:89] found id: ""
	I1218 00:46:04.375305 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.375311 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:04.375316 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:04.375381 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:04.410898 1201669 cri.go:89] found id: ""
	I1218 00:46:04.410911 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.410918 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:04.410923 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:04.410980 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:04.448128 1201669 cri.go:89] found id: ""
	I1218 00:46:04.448141 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.448151 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:04.448156 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:04.448214 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:04.478635 1201669 cri.go:89] found id: ""
	I1218 00:46:04.478648 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.478655 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:04.478660 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:04.478718 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:04.504261 1201669 cri.go:89] found id: ""
	I1218 00:46:04.504275 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.504282 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:04.504288 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:04.504345 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:04.529823 1201669 cri.go:89] found id: ""
	I1218 00:46:04.529836 1201669 logs.go:282] 0 containers: []
	W1218 00:46:04.529843 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:04.529851 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:04.529862 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:04.595056 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:04.587112   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.587762   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589353   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.589778   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:04.591223   13230 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:04.595066 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:04.595076 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:04.665580 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:04.665600 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:04.695540 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:04.695555 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:04.766700 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:04.766721 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.281438 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:07.291184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:07.291241 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:07.318270 1201669 cri.go:89] found id: ""
	I1218 00:46:07.318283 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.318290 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:07.318295 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:07.318353 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:07.342684 1201669 cri.go:89] found id: ""
	I1218 00:46:07.342697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.342704 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:07.342718 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:07.342777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:07.367159 1201669 cri.go:89] found id: ""
	I1218 00:46:07.367173 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.367180 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:07.367186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:07.367252 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:07.399917 1201669 cri.go:89] found id: ""
	I1218 00:46:07.399942 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.399949 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:07.399954 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:07.400025 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:07.428891 1201669 cri.go:89] found id: ""
	I1218 00:46:07.428904 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.428911 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:07.428918 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:07.428988 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:07.461232 1201669 cri.go:89] found id: ""
	I1218 00:46:07.461244 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.461251 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:07.461257 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:07.461319 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:07.487577 1201669 cri.go:89] found id: ""
	I1218 00:46:07.487590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:07.487607 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:07.487616 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:07.487626 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:07.554637 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:07.554656 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:07.570064 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:07.570080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:07.635097 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:07.627057   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.627642   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629308   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.629740   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:07.631233   13343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:07.635107 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:07.635118 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:07.706762 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:07.706782 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.235305 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:10.245498 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:10.245568 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:10.275954 1201669 cri.go:89] found id: ""
	I1218 00:46:10.275965 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.275972 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:10.275985 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:10.276042 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:10.301377 1201669 cri.go:89] found id: ""
	I1218 00:46:10.301391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.301397 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:10.301402 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:10.301468 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:10.327075 1201669 cri.go:89] found id: ""
	I1218 00:46:10.327089 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.327096 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:10.327101 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:10.327163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:10.355039 1201669 cri.go:89] found id: ""
	I1218 00:46:10.355052 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.355059 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:10.355064 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:10.355126 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:10.380800 1201669 cri.go:89] found id: ""
	I1218 00:46:10.380814 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.380821 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:10.380826 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:10.380883 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:10.420766 1201669 cri.go:89] found id: ""
	I1218 00:46:10.420781 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.420788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:10.420794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:10.420852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:10.450993 1201669 cri.go:89] found id: ""
	I1218 00:46:10.451006 1201669 logs.go:282] 0 containers: []
	W1218 00:46:10.451013 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:10.451021 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:10.451031 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:10.469649 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:10.469664 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:10.534853 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:10.526326   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527133   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.527959   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.529531   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:10.530066   13446 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:10.534862 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:10.534873 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:10.603061 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:10.603080 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:10.634944 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:10.634961 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.201986 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:13.212552 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:13.212611 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:13.236454 1201669 cri.go:89] found id: ""
	I1218 00:46:13.236468 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.236475 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:13.236481 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:13.236542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:13.261394 1201669 cri.go:89] found id: ""
	I1218 00:46:13.261408 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.261415 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:13.261420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:13.261479 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:13.286366 1201669 cri.go:89] found id: ""
	I1218 00:46:13.286380 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.286393 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:13.286398 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:13.286457 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:13.311045 1201669 cri.go:89] found id: ""
	I1218 00:46:13.311058 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.311065 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:13.311070 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:13.311132 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:13.336414 1201669 cri.go:89] found id: ""
	I1218 00:46:13.336427 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.336434 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:13.336439 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:13.336503 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:13.366089 1201669 cri.go:89] found id: ""
	I1218 00:46:13.366102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.366109 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:13.366114 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:13.366170 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:13.398167 1201669 cri.go:89] found id: ""
	I1218 00:46:13.398180 1201669 logs.go:282] 0 containers: []
	W1218 00:46:13.398187 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:13.398195 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:13.398205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:13.472148 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:13.472173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:13.487248 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:13.487267 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:13.552950 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:13.544025   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.544746   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546313   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.546830   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:13.548418   13551 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1218 00:46:13.552960 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:13.552973 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:13.622039 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:13.622058 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:16.149384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:16.159725 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:16.159786 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:16.186967 1201669 cri.go:89] found id: ""
	I1218 00:46:16.186981 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.186988 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:16.186993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:16.187052 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:16.213347 1201669 cri.go:89] found id: ""
	I1218 00:46:16.213361 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.213368 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:16.213374 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:16.213431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:16.239666 1201669 cri.go:89] found id: ""
	I1218 00:46:16.239679 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.239686 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:16.239692 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:16.239747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:16.264667 1201669 cri.go:89] found id: ""
	I1218 00:46:16.264680 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.264686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:16.264691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:16.264747 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:16.290913 1201669 cri.go:89] found id: ""
	I1218 00:46:16.290925 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.290932 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:16.290937 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:16.290995 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:16.318436 1201669 cri.go:89] found id: ""
	I1218 00:46:16.318449 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.318458 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:16.318464 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:16.318522 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:16.344303 1201669 cri.go:89] found id: ""
	I1218 00:46:16.344316 1201669 logs.go:282] 0 containers: []
	W1218 00:46:16.344323 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:16.344331 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:16.344342 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:16.411796 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:16.411814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:16.427899 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:16.427916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:16.499022 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:16.490316   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.490884   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.492692   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.493326   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:16.495095   13659 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:16.499032 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:16.499042 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:16.568931 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:16.568951 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.102749 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:19.112504 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:19.112560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:19.140375 1201669 cri.go:89] found id: ""
	I1218 00:46:19.140389 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.140396 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:19.140401 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:19.140462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:19.170806 1201669 cri.go:89] found id: ""
	I1218 00:46:19.170832 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.170840 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:19.170848 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:19.170930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:19.202879 1201669 cri.go:89] found id: ""
	I1218 00:46:19.202894 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.202901 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:19.202907 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:19.202973 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:19.226832 1201669 cri.go:89] found id: ""
	I1218 00:46:19.226844 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.226851 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:19.226856 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:19.226913 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:19.251251 1201669 cri.go:89] found id: ""
	I1218 00:46:19.251264 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.251271 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:19.251277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:19.251334 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:19.275051 1201669 cri.go:89] found id: ""
	I1218 00:46:19.275064 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.275071 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:19.275080 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:19.275138 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:19.303255 1201669 cri.go:89] found id: ""
	I1218 00:46:19.303268 1201669 logs.go:282] 0 containers: []
	W1218 00:46:19.303291 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:19.303299 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:19.303309 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:19.332819 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:19.332836 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:19.398262 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:19.398281 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:19.413015 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:19.413030 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:19.483412 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:19.474680   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.475404   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477072   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.477642   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:19.479329   13778 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:19.483423 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:19.483475 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:22.052118 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:22.062390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:22.062454 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:22.091903 1201669 cri.go:89] found id: ""
	I1218 00:46:22.091917 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.091924 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:22.091930 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:22.091987 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:22.116458 1201669 cri.go:89] found id: ""
	I1218 00:46:22.116471 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.116478 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:22.116483 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:22.116560 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:22.142090 1201669 cri.go:89] found id: ""
	I1218 00:46:22.142102 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.142109 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:22.142115 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:22.142180 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:22.166148 1201669 cri.go:89] found id: ""
	I1218 00:46:22.166162 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.166169 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:22.166175 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:22.166234 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:22.191864 1201669 cri.go:89] found id: ""
	I1218 00:46:22.191877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.191884 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:22.191890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:22.191953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:22.216176 1201669 cri.go:89] found id: ""
	I1218 00:46:22.216190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.216197 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:22.216202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:22.216283 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:22.240865 1201669 cri.go:89] found id: ""
	I1218 00:46:22.240878 1201669 logs.go:282] 0 containers: []
	W1218 00:46:22.240891 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:22.240898 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:22.240908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:22.269665 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:22.269688 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:22.334885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:22.334903 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:22.349240 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:22.349256 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:22.424972 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:22.415022   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.416043   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417651   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.417971   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:22.420766   13878 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:22.424982 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:22.425001 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.004463 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:25.015873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:25.015934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:25.043533 1201669 cri.go:89] found id: ""
	I1218 00:46:25.043547 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.043558 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:25.043563 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:25.043630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:25.070860 1201669 cri.go:89] found id: ""
	I1218 00:46:25.070874 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.070881 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:25.070887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:25.070945 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:25.100326 1201669 cri.go:89] found id: ""
	I1218 00:46:25.100340 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.100349 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:25.100356 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:25.100420 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:25.127292 1201669 cri.go:89] found id: ""
	I1218 00:46:25.127306 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.127313 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:25.127318 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:25.127376 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:25.152929 1201669 cri.go:89] found id: ""
	I1218 00:46:25.152943 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.152950 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:25.152955 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:25.153023 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:25.179602 1201669 cri.go:89] found id: ""
	I1218 00:46:25.179622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.179629 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:25.179634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:25.179691 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:25.204777 1201669 cri.go:89] found id: ""
	I1218 00:46:25.204790 1201669 logs.go:282] 0 containers: []
	W1218 00:46:25.204797 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:25.204804 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:25.204814 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:25.274359 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:25.274379 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:25.305207 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:25.305224 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:25.375922 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:25.375941 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:25.392181 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:25.392196 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:25.470714 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:25.462167   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.462798   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464462   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.464968   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:25.466610   13992 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:27.970992 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:27.980972 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:27.981029 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:28.007728 1201669 cri.go:89] found id: ""
	I1218 00:46:28.007744 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.007752 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:28.007758 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:28.007821 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:28.038973 1201669 cri.go:89] found id: ""
	I1218 00:46:28.038987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.038995 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:28.039000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:28.039063 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:28.066609 1201669 cri.go:89] found id: ""
	I1218 00:46:28.066622 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.066629 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:28.066634 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:28.066695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:28.092484 1201669 cri.go:89] found id: ""
	I1218 00:46:28.092498 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.092506 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:28.092512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:28.092583 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:28.119611 1201669 cri.go:89] found id: ""
	I1218 00:46:28.119625 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.119632 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:28.119638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:28.119698 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:28.145154 1201669 cri.go:89] found id: ""
	I1218 00:46:28.145167 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.145175 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:28.145180 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:28.145238 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:28.170178 1201669 cri.go:89] found id: ""
	I1218 00:46:28.170191 1201669 logs.go:282] 0 containers: []
	W1218 00:46:28.170198 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:28.170206 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:28.170216 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:28.235805 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:28.235824 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:28.250608 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:28.250629 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:28.314678 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:28.307119   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.307553   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.308984   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.309302   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:28.310693   14079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:28.314687 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:28.314698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:28.383399 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:28.383420 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:30.924810 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:30.935068 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:30.935128 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:30.960550 1201669 cri.go:89] found id: ""
	I1218 00:46:30.960563 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.960570 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:30.960575 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:30.960636 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:30.985705 1201669 cri.go:89] found id: ""
	I1218 00:46:30.985718 1201669 logs.go:282] 0 containers: []
	W1218 00:46:30.985725 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:30.985730 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:30.985787 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:31.011725 1201669 cri.go:89] found id: ""
	I1218 00:46:31.011739 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.011746 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:31.011751 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:31.011813 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:31.038735 1201669 cri.go:89] found id: ""
	I1218 00:46:31.038748 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.038755 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:31.038760 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:31.038822 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:31.062623 1201669 cri.go:89] found id: ""
	I1218 00:46:31.062637 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.062645 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:31.062651 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:31.062716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:31.089339 1201669 cri.go:89] found id: ""
	I1218 00:46:31.089353 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.089366 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:31.089372 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:31.089431 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:31.119659 1201669 cri.go:89] found id: ""
	I1218 00:46:31.119672 1201669 logs.go:282] 0 containers: []
	W1218 00:46:31.119679 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:31.119687 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:31.119698 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:31.185677 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:31.185697 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:31.200077 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:31.200092 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:31.263573 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:31.255368   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.256074   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.257658   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.258160   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:31.259792   14186 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:31.263582 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:31.263593 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:31.331836 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:31.331857 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:33.859870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:33.871250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:33.871309 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:33.899076 1201669 cri.go:89] found id: ""
	I1218 00:46:33.899090 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.899097 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:33.899103 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:33.899163 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:33.927937 1201669 cri.go:89] found id: ""
	I1218 00:46:33.927955 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.927961 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:33.927967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:33.928024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:33.954257 1201669 cri.go:89] found id: ""
	I1218 00:46:33.954271 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.954278 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:33.954283 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:33.954339 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:33.978840 1201669 cri.go:89] found id: ""
	I1218 00:46:33.978853 1201669 logs.go:282] 0 containers: []
	W1218 00:46:33.978860 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:33.978865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:33.978921 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:34.008172 1201669 cri.go:89] found id: ""
	I1218 00:46:34.008186 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.008193 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:34.008198 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:34.008296 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:34.038029 1201669 cri.go:89] found id: ""
	I1218 00:46:34.038043 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.038050 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:34.038057 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:34.038116 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:34.067280 1201669 cri.go:89] found id: ""
	I1218 00:46:34.067294 1201669 logs.go:282] 0 containers: []
	W1218 00:46:34.067302 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:34.067311 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:34.067321 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:34.099533 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:34.099549 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:34.165421 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:34.165442 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:34.179966 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:34.179981 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:34.243670 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:34.235234   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.236061   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.237756   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.238073   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:34.239589   14297 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:34.243681 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:34.243694 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:36.812424 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:36.822427 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:36.822486 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:36.847846 1201669 cri.go:89] found id: ""
	I1218 00:46:36.847859 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.847866 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:36.847872 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:36.847927 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:36.873323 1201669 cri.go:89] found id: ""
	I1218 00:46:36.873337 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.873344 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:36.873349 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:36.873408 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:36.898528 1201669 cri.go:89] found id: ""
	I1218 00:46:36.898541 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.898547 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:36.898553 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:36.898608 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:36.925176 1201669 cri.go:89] found id: ""
	I1218 00:46:36.925190 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.925197 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:36.925202 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:36.925260 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:36.954449 1201669 cri.go:89] found id: ""
	I1218 00:46:36.954463 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.954469 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:36.954474 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:36.954533 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:36.978226 1201669 cri.go:89] found id: ""
	I1218 00:46:36.978239 1201669 logs.go:282] 0 containers: []
	W1218 00:46:36.978246 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:36.978251 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:36.978308 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:37.005731 1201669 cri.go:89] found id: ""
	I1218 00:46:37.005747 1201669 logs.go:282] 0 containers: []
	W1218 00:46:37.005755 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:37.005764 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:37.005776 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:37.026584 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:37.026606 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:37.089657 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:37.081537   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.082271   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.083936   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.084492   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:37.086007   14389 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:37.089672 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:37.089683 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:37.161954 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:37.161980 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:37.189136 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:37.189155 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:39.765929 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:39.776452 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:39.776510 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:39.801519 1201669 cri.go:89] found id: ""
	I1218 00:46:39.801532 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.801539 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:39.801544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:39.801604 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:39.829201 1201669 cri.go:89] found id: ""
	I1218 00:46:39.829215 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.829222 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:39.829226 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:39.829287 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:39.854274 1201669 cri.go:89] found id: ""
	I1218 00:46:39.854287 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.854294 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:39.854299 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:39.854357 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:39.879811 1201669 cri.go:89] found id: ""
	I1218 00:46:39.879824 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.879831 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:39.879836 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:39.879893 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:39.912296 1201669 cri.go:89] found id: ""
	I1218 00:46:39.912310 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.912317 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:39.912322 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:39.912380 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:39.939288 1201669 cri.go:89] found id: ""
	I1218 00:46:39.939313 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.939321 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:39.939326 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:39.939393 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:39.967012 1201669 cri.go:89] found id: ""
	I1218 00:46:39.967027 1201669 logs.go:282] 0 containers: []
	W1218 00:46:39.967034 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:39.967041 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:39.967051 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:40.033896 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:40.033919 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:40.052546 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:40.052564 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:40.123489 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:40.114673   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.115138   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.116907   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.117543   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:40.119178   14499 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:40.123524 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:40.123537 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:40.195140 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:40.195161 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:42.731664 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:42.741511 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:42.741573 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:42.765856 1201669 cri.go:89] found id: ""
	I1218 00:46:42.765869 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.765876 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:42.765881 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:42.765947 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:42.790000 1201669 cri.go:89] found id: ""
	I1218 00:46:42.790013 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.790020 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:42.790025 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:42.790080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:42.814497 1201669 cri.go:89] found id: ""
	I1218 00:46:42.814511 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.814518 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:42.814523 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:42.814580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:42.839923 1201669 cri.go:89] found id: ""
	I1218 00:46:42.839937 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.839943 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:42.839948 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:42.840009 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:42.866771 1201669 cri.go:89] found id: ""
	I1218 00:46:42.866784 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.866791 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:42.866798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:42.866856 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:42.894391 1201669 cri.go:89] found id: ""
	I1218 00:46:42.894404 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.894411 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:42.894416 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:42.894481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:42.919369 1201669 cri.go:89] found id: ""
	I1218 00:46:42.919391 1201669 logs.go:282] 0 containers: []
	W1218 00:46:42.919399 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:42.919408 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:42.919419 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:42.934812 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:42.934829 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:42.998153 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:42.989569   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.989983   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.991588   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.992181   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:42.993786   14605 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:42.998162 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:42.998173 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:43.067475 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:43.067494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:43.097319 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:43.097335 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:45.664349 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:45.675110 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:45.675171 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:45.707428 1201669 cri.go:89] found id: ""
	I1218 00:46:45.707442 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.707449 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:45.707454 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:45.707512 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:45.732673 1201669 cri.go:89] found id: ""
	I1218 00:46:45.732687 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.732694 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:45.732700 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:45.732759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:45.756652 1201669 cri.go:89] found id: ""
	I1218 00:46:45.756666 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.756673 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:45.756679 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:45.756741 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:45.781416 1201669 cri.go:89] found id: ""
	I1218 00:46:45.781430 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.781437 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:45.781442 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:45.781498 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:45.806268 1201669 cri.go:89] found id: ""
	I1218 00:46:45.806281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.806288 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:45.806294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:45.806363 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:45.831015 1201669 cri.go:89] found id: ""
	I1218 00:46:45.831028 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.831035 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:45.831040 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:45.831098 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:45.855951 1201669 cri.go:89] found id: ""
	I1218 00:46:45.855964 1201669 logs.go:282] 0 containers: []
	W1218 00:46:45.855970 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:45.855978 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:45.855988 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:45.870419 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:45.870436 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:45.934620 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:45.926005   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.926808   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.928556   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.929076   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:45.930752   14715 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:45.934630 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:45.934641 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:46.007377 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:46.007400 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:46.038285 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:46.038302 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.604685 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:48.614701 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:48.614759 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:48.640971 1201669 cri.go:89] found id: ""
	I1218 00:46:48.640984 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.640991 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:48.640997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:48.641055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:48.670241 1201669 cri.go:89] found id: ""
	I1218 00:46:48.670254 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.670261 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:48.670266 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:48.670324 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:48.714267 1201669 cri.go:89] found id: ""
	I1218 00:46:48.714281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.714288 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:48.714294 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:48.714359 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:48.743058 1201669 cri.go:89] found id: ""
	I1218 00:46:48.743071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.743077 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:48.743083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:48.743146 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:48.768865 1201669 cri.go:89] found id: ""
	I1218 00:46:48.768877 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.768885 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:48.768890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:48.768950 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:48.794057 1201669 cri.go:89] found id: ""
	I1218 00:46:48.794071 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.794078 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:48.794083 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:48.794139 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:48.824069 1201669 cri.go:89] found id: ""
	I1218 00:46:48.824082 1201669 logs.go:282] 0 containers: []
	W1218 00:46:48.824090 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:48.824102 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:48.824112 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:48.893155 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:48.893176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:48.908605 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:48.908621 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:48.974531 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:48.966647   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.967414   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.968909   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.969380   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:48.970840   14822 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:48.974541 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:48.974551 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:49.047912 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:49.047931 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.578760 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:51.588638 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:51.588697 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:51.620629 1201669 cri.go:89] found id: ""
	I1218 00:46:51.620643 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.620649 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:51.620661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:51.620737 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:51.653267 1201669 cri.go:89] found id: ""
	I1218 00:46:51.653281 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.653297 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:51.653302 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:51.653372 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:51.680215 1201669 cri.go:89] found id: ""
	I1218 00:46:51.680250 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.680257 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:51.680263 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:51.680328 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:51.712435 1201669 cri.go:89] found id: ""
	I1218 00:46:51.712448 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.712455 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:51.712460 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:51.712525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:51.740973 1201669 cri.go:89] found id: ""
	I1218 00:46:51.740987 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.740994 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:51.741000 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:51.741057 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:51.765683 1201669 cri.go:89] found id: ""
	I1218 00:46:51.765697 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.765704 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:51.765710 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:51.765767 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:51.793065 1201669 cri.go:89] found id: ""
	I1218 00:46:51.793080 1201669 logs.go:282] 0 containers: []
	W1218 00:46:51.793088 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:51.793095 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:51.793106 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:51.807847 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:51.807863 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:51.870944 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:51.862356   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.863004   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.864744   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.865376   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:51.866957   14922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:51.870953 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:51.870964 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:51.939037 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:51.939057 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:51.973517 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:51.973532 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:54.540109 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:54.550150 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:54.550216 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:54.579006 1201669 cri.go:89] found id: ""
	I1218 00:46:54.579019 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.579026 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:54.579031 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:54.579088 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:54.609045 1201669 cri.go:89] found id: ""
	I1218 00:46:54.609059 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.609066 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:54.609071 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:54.609130 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:54.640693 1201669 cri.go:89] found id: ""
	I1218 00:46:54.640707 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.640714 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:54.640720 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:54.640777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:54.674577 1201669 cri.go:89] found id: ""
	I1218 00:46:54.674590 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.674597 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:54.674603 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:54.674658 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:54.709862 1201669 cri.go:89] found id: ""
	I1218 00:46:54.709875 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.709882 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:54.709887 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:54.709946 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:54.735151 1201669 cri.go:89] found id: ""
	I1218 00:46:54.735165 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.735171 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:54.735177 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:54.735237 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:54.762946 1201669 cri.go:89] found id: ""
	I1218 00:46:54.762960 1201669 logs.go:282] 0 containers: []
	W1218 00:46:54.762966 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:54.762974 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:54.762984 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:54.778250 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:54.778266 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:54.841698 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:54.833513   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.833954   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.835582   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.836177   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:54.837811   15025 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:54.841707 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:54.841718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:54.909164 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:54.909183 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:46:54.946219 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:54.946236 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.515189 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:46:57.525323 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:46:57.525384 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:46:57.550694 1201669 cri.go:89] found id: ""
	I1218 00:46:57.550708 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.550716 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:46:57.550721 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:46:57.550782 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:46:57.578567 1201669 cri.go:89] found id: ""
	I1218 00:46:57.578582 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.578590 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:46:57.578595 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:46:57.578656 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:46:57.604092 1201669 cri.go:89] found id: ""
	I1218 00:46:57.604105 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.604112 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:46:57.604120 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:46:57.604178 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:46:57.628719 1201669 cri.go:89] found id: ""
	I1218 00:46:57.628733 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.628739 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:46:57.628744 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:46:57.628806 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:46:57.666872 1201669 cri.go:89] found id: ""
	I1218 00:46:57.666885 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.666892 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:46:57.666897 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:46:57.666954 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:46:57.703636 1201669 cri.go:89] found id: ""
	I1218 00:46:57.703649 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.703656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:46:57.703661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:46:57.703721 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:46:57.729878 1201669 cri.go:89] found id: ""
	I1218 00:46:57.729891 1201669 logs.go:282] 0 containers: []
	W1218 00:46:57.729898 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:46:57.729905 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:46:57.729916 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:46:57.793892 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:46:57.793911 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:46:57.808664 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:46:57.808680 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:46:57.871552 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:46:57.863886   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.864342   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866011   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.866458   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:46:57.868103   15132 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:46:57.871570 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:46:57.871582 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:46:57.939629 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:46:57.939649 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:00.470791 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:00.480890 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:00.480955 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:00.510265 1201669 cri.go:89] found id: ""
	I1218 00:47:00.510278 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.510285 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:00.510290 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:00.510349 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:00.534908 1201669 cri.go:89] found id: ""
	I1218 00:47:00.534922 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.534929 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:00.534934 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:00.534992 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:00.559619 1201669 cri.go:89] found id: ""
	I1218 00:47:00.559632 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.559639 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:00.559644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:00.559705 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:00.587698 1201669 cri.go:89] found id: ""
	I1218 00:47:00.587711 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.587719 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:00.587724 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:00.587781 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:00.611884 1201669 cri.go:89] found id: ""
	I1218 00:47:00.611897 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.611904 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:00.611909 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:00.611974 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:00.640874 1201669 cri.go:89] found id: ""
	I1218 00:47:00.640888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.640895 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:00.640900 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:00.640965 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:00.674185 1201669 cri.go:89] found id: ""
	I1218 00:47:00.674198 1201669 logs.go:282] 0 containers: []
	W1218 00:47:00.674205 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:00.674213 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:00.674223 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:00.750327 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:00.750347 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:00.765877 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:00.765899 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:00.831441 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:00.822958   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.823674   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.825414   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.826032   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:00.827654   15235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:00.831450 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:00.831462 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:00.899398 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:00.899423 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:03.427398 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:03.437572 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:03.437634 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:03.466927 1201669 cri.go:89] found id: ""
	I1218 00:47:03.466940 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.466948 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:03.466952 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:03.467011 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:03.492647 1201669 cri.go:89] found id: ""
	I1218 00:47:03.492661 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.492668 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:03.492672 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:03.492729 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:03.522689 1201669 cri.go:89] found id: ""
	I1218 00:47:03.522702 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.522709 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:03.522714 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:03.522774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:03.547665 1201669 cri.go:89] found id: ""
	I1218 00:47:03.547679 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.547686 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:03.547691 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:03.547754 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:03.573125 1201669 cri.go:89] found id: ""
	I1218 00:47:03.573139 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.573146 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:03.573151 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:03.573209 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:03.598799 1201669 cri.go:89] found id: ""
	I1218 00:47:03.598812 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.598819 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:03.598825 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:03.598882 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:03.622999 1201669 cri.go:89] found id: ""
	I1218 00:47:03.623013 1201669 logs.go:282] 0 containers: []
	W1218 00:47:03.623019 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:03.623027 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:03.623037 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:03.697686 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:03.697703 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:03.715817 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:03.715833 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:03.782593 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:03.774146   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.774698   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.776429   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.777016   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:03.778645   15343 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:03.782603 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:03.782616 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:03.850592 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:03.850611 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:06.381230 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:06.390993 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:06.391053 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:06.414602 1201669 cri.go:89] found id: ""
	I1218 00:47:06.414616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.414622 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:06.414628 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:06.414684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:06.438729 1201669 cri.go:89] found id: ""
	I1218 00:47:06.438743 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.438750 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:06.438755 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:06.438820 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:06.463196 1201669 cri.go:89] found id: ""
	I1218 00:47:06.463208 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.463215 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:06.463220 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:06.463275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:06.488161 1201669 cri.go:89] found id: ""
	I1218 00:47:06.488174 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.488181 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:06.488186 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:06.488275 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:06.517546 1201669 cri.go:89] found id: ""
	I1218 00:47:06.517559 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.517566 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:06.517571 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:06.517630 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:06.541811 1201669 cri.go:89] found id: ""
	I1218 00:47:06.541825 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.541831 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:06.541837 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:06.541894 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:06.565470 1201669 cri.go:89] found id: ""
	I1218 00:47:06.565483 1201669 logs.go:282] 0 containers: []
	W1218 00:47:06.565491 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:06.565501 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:06.565511 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:06.630810 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:06.630828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:06.650036 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:06.650061 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:06.735359 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:06.725922   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.726879   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.727744   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.728843   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:06.729522   15450 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:06.735369 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:06.735382 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:06.804427 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:06.804447 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.337315 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:09.347711 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:09.347770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:09.373796 1201669 cri.go:89] found id: ""
	I1218 00:47:09.373809 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.373817 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:09.373823 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:09.373887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:09.398745 1201669 cri.go:89] found id: ""
	I1218 00:47:09.398759 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.398766 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:09.398783 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:09.398850 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:09.424602 1201669 cri.go:89] found id: ""
	I1218 00:47:09.424616 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.424623 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:09.424630 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:09.424687 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:09.453853 1201669 cri.go:89] found id: ""
	I1218 00:47:09.453866 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.453873 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:09.453879 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:09.453934 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:09.482334 1201669 cri.go:89] found id: ""
	I1218 00:47:09.482348 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.482355 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:09.482360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:09.482415 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:09.514905 1201669 cri.go:89] found id: ""
	I1218 00:47:09.514928 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.514935 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:09.514941 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:09.515006 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:09.538866 1201669 cri.go:89] found id: ""
	I1218 00:47:09.538888 1201669 logs.go:282] 0 containers: []
	W1218 00:47:09.538895 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:09.538903 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:09.538913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:09.553496 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:09.553516 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:09.615452 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:09.607144   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.607889   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.609597   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.610126   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:09.611749   15546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:09.615461 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:09.615472 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:09.683616 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:09.683638 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:09.715893 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:09.715908 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.282722 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:12.292327 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:12.292388 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:12.317025 1201669 cri.go:89] found id: ""
	I1218 00:47:12.317039 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.317045 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:12.317050 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:12.317106 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:12.341477 1201669 cri.go:89] found id: ""
	I1218 00:47:12.341490 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.341497 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:12.341501 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:12.341556 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:12.365784 1201669 cri.go:89] found id: ""
	I1218 00:47:12.365798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.365805 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:12.365810 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:12.365870 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:12.394874 1201669 cri.go:89] found id: ""
	I1218 00:47:12.394887 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.394894 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:12.394899 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:12.394958 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:12.419496 1201669 cri.go:89] found id: ""
	I1218 00:47:12.419509 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.419516 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:12.419521 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:12.419577 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:12.444379 1201669 cri.go:89] found id: ""
	I1218 00:47:12.444393 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.444399 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:12.444414 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:12.444470 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:12.468918 1201669 cri.go:89] found id: ""
	I1218 00:47:12.468931 1201669 logs.go:282] 0 containers: []
	W1218 00:47:12.468939 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:12.468946 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:12.468960 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:12.537486 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:12.537505 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:12.568974 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:12.568990 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:12.635070 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:12.635089 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:12.652372 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:12.652388 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:12.728630 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:12.720011   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.720845   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.722510   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.723077   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:12.724772   15671 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:15.228895 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:15.239250 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:15.239307 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:15.264983 1201669 cri.go:89] found id: ""
	I1218 00:47:15.264996 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.265003 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:15.265009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:15.265070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:15.293517 1201669 cri.go:89] found id: ""
	I1218 00:47:15.293531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.293537 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:15.293542 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:15.293599 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:15.319218 1201669 cri.go:89] found id: ""
	I1218 00:47:15.319231 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.319238 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:15.319243 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:15.319298 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:15.344396 1201669 cri.go:89] found id: ""
	I1218 00:47:15.344410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.344417 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:15.344422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:15.344481 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:15.373243 1201669 cri.go:89] found id: ""
	I1218 00:47:15.373256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.373263 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:15.373268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:15.373329 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:15.397807 1201669 cri.go:89] found id: ""
	I1218 00:47:15.397820 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.397827 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:15.397832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:15.397887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:15.422535 1201669 cri.go:89] found id: ""
	I1218 00:47:15.422549 1201669 logs.go:282] 0 containers: []
	W1218 00:47:15.422557 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:15.422564 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:15.422574 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:15.490575 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:15.490595 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:15.521157 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:15.521176 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:15.592728 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:15.592747 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:15.607949 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:15.607965 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:15.688565 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:15.679821   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.680637   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682313   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.682620   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:15.684705   15771 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.190283 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:18.200009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:18.200073 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:18.224427 1201669 cri.go:89] found id: ""
	I1218 00:47:18.224440 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.224447 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:18.224453 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:18.224514 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:18.248627 1201669 cri.go:89] found id: ""
	I1218 00:47:18.248641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.248648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:18.248653 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:18.248711 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:18.275672 1201669 cri.go:89] found id: ""
	I1218 00:47:18.275690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.275703 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:18.275709 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:18.275766 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:18.302626 1201669 cri.go:89] found id: ""
	I1218 00:47:18.302640 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.302656 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:18.302661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:18.302716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:18.328772 1201669 cri.go:89] found id: ""
	I1218 00:47:18.328785 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.328792 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:18.328797 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:18.328852 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:18.354242 1201669 cri.go:89] found id: ""
	I1218 00:47:18.354256 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.354263 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:18.354268 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:18.354332 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:18.378135 1201669 cri.go:89] found id: ""
	I1218 00:47:18.378148 1201669 logs.go:282] 0 containers: []
	W1218 00:47:18.378157 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:18.378165 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:18.378175 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:18.443885 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:18.443904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:18.458116 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:18.458135 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:18.520486 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:18.512317   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.513076   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.514591   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.515130   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:18.516782   15863 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:18.520496 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:18.520507 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:18.586967 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:18.586986 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:21.118235 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:21.128015 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:21.128072 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:21.153715 1201669 cri.go:89] found id: ""
	I1218 00:47:21.153729 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.153736 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:21.153742 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:21.153803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:21.183062 1201669 cri.go:89] found id: ""
	I1218 00:47:21.183075 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.183082 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:21.183087 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:21.183144 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:21.210382 1201669 cri.go:89] found id: ""
	I1218 00:47:21.210396 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.210402 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:21.210407 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:21.210462 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:21.235561 1201669 cri.go:89] found id: ""
	I1218 00:47:21.235575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.235582 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:21.235587 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:21.235684 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:21.261486 1201669 cri.go:89] found id: ""
	I1218 00:47:21.261500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.261507 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:21.261512 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:21.261571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:21.286687 1201669 cri.go:89] found id: ""
	I1218 00:47:21.286701 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.286708 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:21.286713 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:21.286770 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:21.312639 1201669 cri.go:89] found id: ""
	I1218 00:47:21.312656 1201669 logs.go:282] 0 containers: []
	W1218 00:47:21.312663 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:21.312671 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:21.312682 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:21.377475 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:21.377494 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:21.394148 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:21.394166 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:21.461525 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:21.452467   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.453950   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.454852   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.456508   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:21.457049   15966 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:21.461535 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:21.461546 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:21.529823 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:21.529841 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.060601 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:24.071009 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:24.071080 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:24.098379 1201669 cri.go:89] found id: ""
	I1218 00:47:24.098392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.098399 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:24.098406 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:24.098520 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:24.125402 1201669 cri.go:89] found id: ""
	I1218 00:47:24.125416 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.125423 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:24.125428 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:24.125487 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:24.151397 1201669 cri.go:89] found id: ""
	I1218 00:47:24.151410 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.151417 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:24.151422 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:24.151485 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:24.178459 1201669 cri.go:89] found id: ""
	I1218 00:47:24.178473 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.178480 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:24.178485 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:24.178542 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:24.204162 1201669 cri.go:89] found id: ""
	I1218 00:47:24.204175 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.204182 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:24.204188 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:24.204282 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:24.232955 1201669 cri.go:89] found id: ""
	I1218 00:47:24.232969 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.232977 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:24.232982 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:24.233043 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:24.258828 1201669 cri.go:89] found id: ""
	I1218 00:47:24.258841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:24.258848 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:24.258856 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:24.258867 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:24.285593 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:24.285609 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:24.352328 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:24.352348 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:24.367078 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:24.367095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:24.430867 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:24.422156   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.422897   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.424622   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.425151   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:24.426618   16084 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:24.430877 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:24.430887 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.002647 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:27.013860 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:27.013930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:27.042334 1201669 cri.go:89] found id: ""
	I1218 00:47:27.042347 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.042354 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:27.042360 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:27.042419 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:27.066697 1201669 cri.go:89] found id: ""
	I1218 00:47:27.066710 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.066717 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:27.066722 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:27.066777 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:27.094998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.095011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.095018 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:27.095024 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:27.095081 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:27.122504 1201669 cri.go:89] found id: ""
	I1218 00:47:27.122518 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.122525 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:27.122530 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:27.122587 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:27.147998 1201669 cri.go:89] found id: ""
	I1218 00:47:27.148011 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.148018 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:27.148023 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:27.148093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:27.172132 1201669 cri.go:89] found id: ""
	I1218 00:47:27.172149 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.172156 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:27.172161 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:27.172253 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:27.197418 1201669 cri.go:89] found id: ""
	I1218 00:47:27.197431 1201669 logs.go:282] 0 containers: []
	W1218 00:47:27.197438 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:27.197445 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:27.197455 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:27.263570 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:27.263588 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:27.278312 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:27.278327 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:27.342448 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:27.333583   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.334359   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336203   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.336926   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:27.338518   16175 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:27.342458 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:27.342469 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:27.410881 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:27.410901 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:29.944358 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:29.954644 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:29.954701 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:29.978633 1201669 cri.go:89] found id: ""
	I1218 00:47:29.978647 1201669 logs.go:282] 0 containers: []
	W1218 00:47:29.978654 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:29.978659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:29.978717 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:30.009832 1201669 cri.go:89] found id: ""
	I1218 00:47:30.009850 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.009858 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:30.009864 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:30.009938 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:30.040840 1201669 cri.go:89] found id: ""
	I1218 00:47:30.040858 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.040867 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:30.040876 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:30.040952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:30.068318 1201669 cri.go:89] found id: ""
	I1218 00:47:30.068332 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.068339 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:30.068344 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:30.068407 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:30.094562 1201669 cri.go:89] found id: ""
	I1218 00:47:30.094577 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.094584 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:30.094589 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:30.094650 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:30.121388 1201669 cri.go:89] found id: ""
	I1218 00:47:30.121402 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.121409 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:30.121415 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:30.121472 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:30.149519 1201669 cri.go:89] found id: ""
	I1218 00:47:30.149533 1201669 logs.go:282] 0 containers: []
	W1218 00:47:30.149540 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:30.149550 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:30.149565 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:30.177089 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:30.177107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:30.242748 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:30.242767 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:30.257468 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:30.257483 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:30.320728 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:30.312134   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.313121   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.314003   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315432   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:30.315899   16291 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:30.320738 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:30.320749 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:32.889870 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:32.900811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:32.900868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:32.927540 1201669 cri.go:89] found id: ""
	I1218 00:47:32.927553 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.927560 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:32.927565 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:32.927622 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:32.955598 1201669 cri.go:89] found id: ""
	I1218 00:47:32.955611 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.955619 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:32.955623 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:32.955695 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:32.979141 1201669 cri.go:89] found id: ""
	I1218 00:47:32.979155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:32.979162 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:32.979167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:32.979224 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:33.006203 1201669 cri.go:89] found id: ""
	I1218 00:47:33.006218 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.006225 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:33.006230 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:33.006294 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:33.034661 1201669 cri.go:89] found id: ""
	I1218 00:47:33.034675 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.034691 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:33.034697 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:33.034756 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:33.062772 1201669 cri.go:89] found id: ""
	I1218 00:47:33.062786 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.062793 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:33.062804 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:33.062869 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:33.086825 1201669 cri.go:89] found id: ""
	I1218 00:47:33.086839 1201669 logs.go:282] 0 containers: []
	W1218 00:47:33.086846 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:33.086871 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:33.086881 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:33.156565 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:33.156585 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:33.185756 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:33.185772 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:33.256648 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:33.256666 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:33.271243 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:33.271259 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:33.337446 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:33.329367   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.330183   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.331701   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.332168   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:33.333643   16398 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:35.839102 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:35.850275 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:35.850343 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:35.875276 1201669 cri.go:89] found id: ""
	I1218 00:47:35.875289 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.875296 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:35.875301 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:35.875361 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:35.912387 1201669 cri.go:89] found id: ""
	I1218 00:47:35.912400 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.912407 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:35.912412 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:35.912471 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:35.942361 1201669 cri.go:89] found id: ""
	I1218 00:47:35.942379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.942394 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:35.942400 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:35.942499 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:35.972562 1201669 cri.go:89] found id: ""
	I1218 00:47:35.972575 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.972584 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:35.972588 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:35.972644 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:35.998846 1201669 cri.go:89] found id: ""
	I1218 00:47:35.998861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:35.998868 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:35.998874 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:35.998952 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:36.030184 1201669 cri.go:89] found id: ""
	I1218 00:47:36.030197 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.030213 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:36.030219 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:36.030292 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:36.055609 1201669 cri.go:89] found id: ""
	I1218 00:47:36.055624 1201669 logs.go:282] 0 containers: []
	W1218 00:47:36.055640 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:36.055648 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:36.055658 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:36.128355 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:36.128374 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:36.159887 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:36.159904 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:36.229693 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:36.229712 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:36.244397 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:36.244412 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:36.308352 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:36.300670   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.301057   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.302871   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.303201   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:36.304687   16507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:38.808637 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:38.819085 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:38.819152 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:38.844745 1201669 cri.go:89] found id: ""
	I1218 00:47:38.844758 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.844766 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:38.844771 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:38.844827 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:38.869442 1201669 cri.go:89] found id: ""
	I1218 00:47:38.869456 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.869463 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:38.869469 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:38.869531 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:38.899129 1201669 cri.go:89] found id: ""
	I1218 00:47:38.899151 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.899158 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:38.899163 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:38.899232 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:38.929158 1201669 cri.go:89] found id: ""
	I1218 00:47:38.929171 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.929178 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:38.929184 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:38.929250 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:38.959987 1201669 cri.go:89] found id: ""
	I1218 00:47:38.960016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.960023 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:38.960029 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:38.960093 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:38.987077 1201669 cri.go:89] found id: ""
	I1218 00:47:38.987091 1201669 logs.go:282] 0 containers: []
	W1218 00:47:38.987098 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:38.987104 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:38.987160 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:39.015227 1201669 cri.go:89] found id: ""
	I1218 00:47:39.015240 1201669 logs.go:282] 0 containers: []
	W1218 00:47:39.015257 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:39.015266 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:39.015278 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:39.044299 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:39.044322 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:39.110657 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:39.110677 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:39.127155 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:39.127171 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:39.195223 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:39.187402   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.188129   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.189814   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.190338   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:39.191373   16610 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:39.195233 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:39.195243 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:41.762478 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:41.772539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:41.772600 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:41.797940 1201669 cri.go:89] found id: ""
	I1218 00:47:41.797954 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.797961 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:41.797967 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:41.798024 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:41.823230 1201669 cri.go:89] found id: ""
	I1218 00:47:41.823244 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.823251 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:41.823256 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:41.823314 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:41.848348 1201669 cri.go:89] found id: ""
	I1218 00:47:41.848368 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.848384 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:41.848390 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:41.848447 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:41.873186 1201669 cri.go:89] found id: ""
	I1218 00:47:41.873199 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.873207 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:41.873212 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:41.873269 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:41.910240 1201669 cri.go:89] found id: ""
	I1218 00:47:41.910253 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.910260 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:41.910265 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:41.910323 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:41.938636 1201669 cri.go:89] found id: ""
	I1218 00:47:41.938649 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.938656 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:41.938661 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:41.938723 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:41.967011 1201669 cri.go:89] found id: ""
	I1218 00:47:41.967024 1201669 logs.go:282] 0 containers: []
	W1218 00:47:41.967031 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:41.967039 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:41.967048 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:42.032273 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:42.032293 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:42.047961 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:42.047977 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:42.129763 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:42.117370   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.118333   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.120584   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.122498   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:42.123155   16703 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:42.129777 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:42.129788 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:42.203638 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:42.203661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:44.747018 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:44.757561 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:44.757666 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:44.782848 1201669 cri.go:89] found id: ""
	I1218 00:47:44.782861 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.782868 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:44.782873 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:44.782930 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:44.812029 1201669 cri.go:89] found id: ""
	I1218 00:47:44.812042 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.812049 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:44.812054 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:44.812111 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:44.835973 1201669 cri.go:89] found id: ""
	I1218 00:47:44.835986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.835994 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:44.835998 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:44.836055 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:44.865506 1201669 cri.go:89] found id: ""
	I1218 00:47:44.865524 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.865532 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:44.865539 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:44.865596 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:44.895590 1201669 cri.go:89] found id: ""
	I1218 00:47:44.895603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.895610 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:44.895615 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:44.895678 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:44.930517 1201669 cri.go:89] found id: ""
	I1218 00:47:44.930531 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.930538 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:44.930544 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:44.930602 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:44.963147 1201669 cri.go:89] found id: ""
	I1218 00:47:44.963161 1201669 logs.go:282] 0 containers: []
	W1218 00:47:44.963168 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:44.963176 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:44.963187 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:45.068693 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:45.053940   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.054717   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.058310   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.059023   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:45.062114   16802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:45.068706 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:45.068718 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:45.150525 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:45.150547 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:45.198775 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:45.198795 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:45.282633 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:45.282655 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:47.798966 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:47.809011 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:47.809070 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:47.836141 1201669 cri.go:89] found id: ""
	I1218 00:47:47.836155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.836161 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:47.836167 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:47.836256 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:47.862554 1201669 cri.go:89] found id: ""
	I1218 00:47:47.862568 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.862575 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:47.862580 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:47.862645 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:47.889972 1201669 cri.go:89] found id: ""
	I1218 00:47:47.889986 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.889992 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:47.889997 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:47.890054 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:47.922142 1201669 cri.go:89] found id: ""
	I1218 00:47:47.922155 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.922162 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:47.922168 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:47.922223 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:47.956979 1201669 cri.go:89] found id: ""
	I1218 00:47:47.956993 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.956999 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:47.957005 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:47.957062 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:47.982938 1201669 cri.go:89] found id: ""
	I1218 00:47:47.982952 1201669 logs.go:282] 0 containers: []
	W1218 00:47:47.982959 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:47.982965 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:47.983027 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:48.014164 1201669 cri.go:89] found id: ""
	I1218 00:47:48.014178 1201669 logs.go:282] 0 containers: []
	W1218 00:47:48.014184 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:48.014192 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:48.014205 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:48.078819 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:48.069986   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.070704   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072405   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.072971   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:48.074617   16907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:48.078831 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:48.078850 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:48.151018 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:48.151045 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:48.178919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:48.178937 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:48.246806 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:48.246828 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:50.762650 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:50.772894 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:50.772953 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:50.798440 1201669 cri.go:89] found id: ""
	I1218 00:47:50.798453 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.798459 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:50.798468 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:50.798525 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:50.824627 1201669 cri.go:89] found id: ""
	I1218 00:47:50.824641 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.824648 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:50.824654 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:50.824713 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:50.849720 1201669 cri.go:89] found id: ""
	I1218 00:47:50.849732 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.849740 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:50.849745 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:50.849802 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:50.873828 1201669 cri.go:89] found id: ""
	I1218 00:47:50.873841 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.873849 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:50.873854 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:50.873910 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:50.905379 1201669 cri.go:89] found id: ""
	I1218 00:47:50.905392 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.905399 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:50.905404 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:50.905461 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:50.935677 1201669 cri.go:89] found id: ""
	I1218 00:47:50.935690 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.935697 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:50.935702 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:50.935774 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:50.970057 1201669 cri.go:89] found id: ""
	I1218 00:47:50.970070 1201669 logs.go:282] 0 containers: []
	W1218 00:47:50.970077 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:50.970085 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:50.970095 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:51.036789 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:51.036810 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:51.051895 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:51.051913 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:51.116641 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:51.108023   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.108946   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110549   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.110884   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:51.112600   17020 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:51.116651 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:51.116663 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:51.186315 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:51.186337 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:53.718450 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:53.728262 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:53.728318 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:53.755772 1201669 cri.go:89] found id: ""
	I1218 00:47:53.755787 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.755793 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:53.755798 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:53.755855 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:53.780839 1201669 cri.go:89] found id: ""
	I1218 00:47:53.780853 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.780860 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:53.780865 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:53.780929 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:53.806552 1201669 cri.go:89] found id: ""
	I1218 00:47:53.806603 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.806611 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:53.806616 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:53.806672 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:53.832361 1201669 cri.go:89] found id: ""
	I1218 00:47:53.832380 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.832401 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:53.832420 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:53.832492 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:53.859241 1201669 cri.go:89] found id: ""
	I1218 00:47:53.859254 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.859262 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:53.859277 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:53.859335 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:53.884714 1201669 cri.go:89] found id: ""
	I1218 00:47:53.884728 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.884735 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:53.884740 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:53.884803 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:53.921003 1201669 cri.go:89] found id: ""
	I1218 00:47:53.921016 1201669 logs.go:282] 0 containers: []
	W1218 00:47:53.921024 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:53.921031 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:53.921041 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:54.003954 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:54.003975 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:54.020878 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:54.020896 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:54.086911 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:54.078669   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.079215   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.080779   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.081236   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:54.082733   17127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:54.086921 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:54.086943 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:54.157859 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:54.157878 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:56.687608 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:56.697675 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:56.697732 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:56.722032 1201669 cri.go:89] found id: ""
	I1218 00:47:56.722045 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.722053 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:56.722058 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:56.722113 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:56.746685 1201669 cri.go:89] found id: ""
	I1218 00:47:56.746698 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.746705 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:56.746712 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:56.746769 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:56.771487 1201669 cri.go:89] found id: ""
	I1218 00:47:56.771500 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.771508 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:56.771515 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:56.771571 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:56.795765 1201669 cri.go:89] found id: ""
	I1218 00:47:56.795778 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.795785 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:56.795790 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:56.795845 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:56.820457 1201669 cri.go:89] found id: ""
	I1218 00:47:56.820470 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.820477 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:56.820482 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:56.820543 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:56.844750 1201669 cri.go:89] found id: ""
	I1218 00:47:56.844764 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.844788 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:56.844794 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:56.844859 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:56.870299 1201669 cri.go:89] found id: ""
	I1218 00:47:56.870312 1201669 logs.go:282] 0 containers: []
	W1218 00:47:56.870319 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:56.870326 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:56.870336 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:56.957977 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:56.949302   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.949890   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951016   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.951644   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:56.954084   17218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:56.957986 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:56.957996 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:47:57.026903 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:47:57.026922 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:47:57.056057 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:57.056072 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:57.122322 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:57.122341 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.637384 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:47:59.647089 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:47:59.647147 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:47:59.675785 1201669 cri.go:89] found id: ""
	I1218 00:47:59.675798 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.675805 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:47:59.675811 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:47:59.675868 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:47:59.700863 1201669 cri.go:89] found id: ""
	I1218 00:47:59.700876 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.700883 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:47:59.700888 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:47:59.700951 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:47:59.726366 1201669 cri.go:89] found id: ""
	I1218 00:47:59.726379 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.726388 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:47:59.726394 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:47:59.726449 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:47:59.754806 1201669 cri.go:89] found id: ""
	I1218 00:47:59.754819 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.754826 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:47:59.754832 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:47:59.754887 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:47:59.779823 1201669 cri.go:89] found id: ""
	I1218 00:47:59.779842 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.779850 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:47:59.779855 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:47:59.779931 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:47:59.809497 1201669 cri.go:89] found id: ""
	I1218 00:47:59.809511 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.809519 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:47:59.809524 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:47:59.809580 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:47:59.834274 1201669 cri.go:89] found id: ""
	I1218 00:47:59.834287 1201669 logs.go:282] 0 containers: []
	W1218 00:47:59.834294 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:47:59.834302 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:47:59.834312 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:47:59.908086 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:47:59.908107 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 00:47:59.923555 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:47:59.923571 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:47:59.996659 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:47:59.988900   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.989392   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.990920   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.991276   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:47:59.992825   17336 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:47:59.996668 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:47:59.996679 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:48:00.245332 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:48:00.245355 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:48:02.854946 1201669 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 00:48:02.865088 1201669 kubeadm.go:602] duration metric: took 4m2.280648529s to restartPrimaryControlPlane
	W1218 00:48:02.865154 1201669 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1218 00:48:02.865291 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:48:03.285302 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:48:03.298386 1201669 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 00:48:03.307630 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:48:03.307686 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:48:03.316384 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:48:03.316392 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:48:03.316448 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:48:03.324266 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:48:03.324330 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:48:03.332001 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:48:03.339756 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:48:03.339811 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:48:03.347895 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.356395 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:48:03.356451 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:48:03.364239 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:48:03.373496 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:48:03.373555 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:48:03.380932 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:48:03.422222 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:48:03.422277 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:48:03.498554 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:48:03.498619 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:48:03.498653 1201669 kubeadm.go:319] OS: Linux
	I1218 00:48:03.498697 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:48:03.498750 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:48:03.498797 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:48:03.498844 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:48:03.498890 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:48:03.498939 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:48:03.498983 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:48:03.499030 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:48:03.499077 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:48:03.575694 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:48:03.575807 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:48:03.575895 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:48:03.584731 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:48:03.590040 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:48:03.590125 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:48:03.590198 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:48:03.590273 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:48:03.590332 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:48:03.590401 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:48:03.590455 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:48:03.590517 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:48:03.590577 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:48:03.590649 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:48:03.590726 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:48:03.590762 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:48:03.590820 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:48:03.968959 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:48:04.492311 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:48:04.657077 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:48:05.347391 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:48:06.111689 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:48:06.112246 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:48:06.114858 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:48:06.118151 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:48:06.118267 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:48:06.118369 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:48:06.118440 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:48:06.133862 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:48:06.134164 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:48:06.143224 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:48:06.143316 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:48:06.143354 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:48:06.274772 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:48:06.274905 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:52:06.274474 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000113635s
	I1218 00:52:06.274499 1201669 kubeadm.go:319] 
	I1218 00:52:06.274555 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:52:06.274586 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:52:06.274697 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:52:06.274703 1201669 kubeadm.go:319] 
	I1218 00:52:06.274816 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:52:06.274846 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:52:06.274874 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:52:06.274877 1201669 kubeadm.go:319] 
	I1218 00:52:06.279422 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:52:06.279849 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:52:06.279958 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:52:06.280242 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1218 00:52:06.280248 1201669 kubeadm.go:319] 
	I1218 00:52:06.280323 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1218 00:52:06.280425 1201669 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000113635s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 00:52:06.280513 1201669 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 00:52:06.687216 1201669 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 00:52:06.699735 1201669 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 00:52:06.699788 1201669 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 00:52:06.707587 1201669 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 00:52:06.707598 1201669 kubeadm.go:158] found existing configuration files:
	
	I1218 00:52:06.707647 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1218 00:52:06.715175 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 00:52:06.715229 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 00:52:06.722487 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1218 00:52:06.729668 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 00:52:06.729722 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 00:52:06.736814 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.744131 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 00:52:06.744183 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 00:52:06.751469 1201669 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1218 00:52:06.758728 1201669 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 00:52:06.758782 1201669 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 00:52:06.765652 1201669 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 00:52:06.801363 1201669 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 00:52:06.801639 1201669 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 00:52:06.871618 1201669 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 00:52:06.871677 1201669 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 00:52:06.871709 1201669 kubeadm.go:319] OS: Linux
	I1218 00:52:06.871750 1201669 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 00:52:06.871795 1201669 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 00:52:06.871839 1201669 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 00:52:06.871883 1201669 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 00:52:06.871926 1201669 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 00:52:06.871970 1201669 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 00:52:06.872012 1201669 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 00:52:06.872056 1201669 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 00:52:06.872097 1201669 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 00:52:06.943596 1201669 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 00:52:06.943710 1201669 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 00:52:06.943809 1201669 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 00:52:06.952719 1201669 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 00:52:06.957986 1201669 out.go:252]   - Generating certificates and keys ...
	I1218 00:52:06.958071 1201669 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 00:52:06.958134 1201669 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 00:52:06.958209 1201669 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 00:52:06.958270 1201669 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 00:52:06.958342 1201669 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 00:52:06.958395 1201669 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 00:52:06.958469 1201669 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 00:52:06.958529 1201669 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 00:52:06.958603 1201669 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 00:52:06.958674 1201669 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 00:52:06.958710 1201669 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 00:52:06.958765 1201669 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 00:52:07.159266 1201669 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 00:52:07.543682 1201669 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 00:52:07.621245 1201669 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 00:52:07.789755 1201669 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 00:52:08.258810 1201669 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 00:52:08.259464 1201669 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 00:52:08.262206 1201669 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 00:52:08.265520 1201669 out.go:252]   - Booting up control plane ...
	I1218 00:52:08.265615 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 00:52:08.265696 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 00:52:08.266218 1201669 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 00:52:08.282138 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 00:52:08.282258 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 00:52:08.290066 1201669 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 00:52:08.290407 1201669 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 00:52:08.290607 1201669 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 00:52:08.422232 1201669 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 00:52:08.422344 1201669 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 00:56:08.423339 1201669 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001129518s
	I1218 00:56:08.423364 1201669 kubeadm.go:319] 
	I1218 00:56:08.423420 1201669 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 00:56:08.423452 1201669 kubeadm.go:319] 	- The kubelet is not running
	I1218 00:56:08.423565 1201669 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 00:56:08.423570 1201669 kubeadm.go:319] 
	I1218 00:56:08.423755 1201669 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 00:56:08.423825 1201669 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 00:56:08.423872 1201669 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 00:56:08.423876 1201669 kubeadm.go:319] 
	I1218 00:56:08.428596 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 00:56:08.429049 1201669 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 00:56:08.429151 1201669 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 00:56:08.429380 1201669 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 00:56:08.429383 1201669 kubeadm.go:319] 
	I1218 00:56:08.429447 1201669 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 00:56:08.429502 1201669 kubeadm.go:403] duration metric: took 12m7.881074518s to StartCluster
	I1218 00:56:08.429533 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 00:56:08.429592 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 00:56:08.454446 1201669 cri.go:89] found id: ""
	I1218 00:56:08.454459 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.454467 1201669 logs.go:284] No container was found matching "kube-apiserver"
	I1218 00:56:08.454472 1201669 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 00:56:08.454527 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 00:56:08.479309 1201669 cri.go:89] found id: ""
	I1218 00:56:08.479323 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.479330 1201669 logs.go:284] No container was found matching "etcd"
	I1218 00:56:08.479335 1201669 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 00:56:08.479395 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 00:56:08.506727 1201669 cri.go:89] found id: ""
	I1218 00:56:08.506740 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.506747 1201669 logs.go:284] No container was found matching "coredns"
	I1218 00:56:08.506752 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 00:56:08.506809 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 00:56:08.531214 1201669 cri.go:89] found id: ""
	I1218 00:56:08.531228 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.531235 1201669 logs.go:284] No container was found matching "kube-scheduler"
	I1218 00:56:08.531240 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 00:56:08.531295 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 00:56:08.555634 1201669 cri.go:89] found id: ""
	I1218 00:56:08.555647 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.555654 1201669 logs.go:284] No container was found matching "kube-proxy"
	I1218 00:56:08.555659 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 00:56:08.555716 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 00:56:08.580409 1201669 cri.go:89] found id: ""
	I1218 00:56:08.580423 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.580430 1201669 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 00:56:08.580435 1201669 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 00:56:08.580494 1201669 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 00:56:08.605063 1201669 cri.go:89] found id: ""
	I1218 00:56:08.605089 1201669 logs.go:282] 0 containers: []
	W1218 00:56:08.605096 1201669 logs.go:284] No container was found matching "kindnet"
	I1218 00:56:08.605105 1201669 logs.go:123] Gathering logs for describe nodes ...
	I1218 00:56:08.605116 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 00:56:08.684346 1201669 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1218 00:56:08.676135   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.676972   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.678697   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.679001   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:56:08.680480   21114 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 00:56:08.684356 1201669 logs.go:123] Gathering logs for CRI-O ...
	I1218 00:56:08.684367 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 00:56:08.760495 1201669 logs.go:123] Gathering logs for container status ...
	I1218 00:56:08.760515 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 00:56:08.787919 1201669 logs.go:123] Gathering logs for kubelet ...
	I1218 00:56:08.787936 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 00:56:08.853642 1201669 logs.go:123] Gathering logs for dmesg ...
	I1218 00:56:08.853661 1201669 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	W1218 00:56:08.868901 1201669 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 00:56:08.868939 1201669 out.go:285] * 
	W1218 00:56:08.868999 1201669 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.869015 1201669 out.go:285] * 
	W1218 00:56:08.871456 1201669 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 00:56:08.877860 1201669 out.go:203] 
	W1218 00:56:08.880779 1201669 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001129518s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 00:56:08.880832 1201669 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 00:56:08.880854 1201669 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 00:56:08.883989 1201669 out.go:203] 
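
	The failure mode above is consistent throughout this run: kubelet v1.35.0-rc.1 refuses to start on a cgroup v1 host ("kubelet is configured to not run on a host using cgroup v1"), so the control plane never comes up and the healthz wait times out. The preflight warning says the opt-out lives in the kubelet configuration option 'FailCgroupV1'. As a hedged sketch only (the v1beta1 field spelling `failCgroupV1` is an assumption from the warning text; check the KubeletConfiguration reference for your kubelet version), the opt-in to legacy cgroup v1 support would look like:

	```yaml
	# Sketch: opt a v1.35+ kubelet back into cgroup v1 support on a legacy host.
	# Field name/casing is inferred from the preflight warning, not verified here.
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	failCgroupV1: false
	```

	Migrating the host to cgroup v2 remains the supported path; this setting only silences the validation on legacy nodes, and the warning notes the validation must also be explicitly skipped.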
	
	
	==> CRI-O <==
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113118431Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113153129Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113189559Z" level=info msg="Create NRI interface"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113282086Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113290964Z" level=info msg="runtime interface created"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113301647Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113309343Z" level=info msg="runtime interface starting up..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113315505Z" level=info msg="starting plugins..."
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.113327796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 00:43:59 functional-288604 crio[9949]: time="2025-12-18T00:43:59.11339067Z" level=info msg="No systemd watchdog enabled"
	Dec 18 00:43:59 functional-288604 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.578897723Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=a394bef7-706e-4c2b-a83c-e7a192425f8f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.579569606Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=0b73d3f0-8cf4-4881-9be6-303c65310a78 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58003914Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=ba617c6c-560d-48a4-8069-49b5cad617df name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.58069138Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=1b435c90-bcae-4d5e-85b5-8f24b84aad77 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581151364Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=1ca4dc15-0b08-49d0-89ca-728ba68fd7be name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581562446Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=9758cff4-6113-4178-8c9f-4ef34a0e91ee name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:48:03 functional-288604 crio[9949]: time="2025-12-18T00:48:03.581979017Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=34a384cc-3abb-4525-b194-0557e1231baf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:58:24.264182   22749 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:24.264885   22749 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:24.266565   22749 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:24.267098   22749 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:24.268800   22749 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
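
	The repeated `connect: connection refused` on port 8441 means nothing is listening on the apiserver port at all (consistent with the kubelet crash loop above), not an auth or TLS problem. A plain TCP dial distinguishes "no listener" from "listener up but unhealthy"; a minimal Go sketch, where a throwaway local listener stands in for a reachable apiserver:

	```go
	package main

	import (
		"fmt"
		"net"
		"time"
	)

	// probe reports whether a TCP endpoint accepts connections within timeout.
	// A "connection refused" error from the dial means no listener is bound,
	// matching the localhost:8441 failures in the log above.
	func probe(addr string, timeout time.Duration) error {
		conn, err := net.DialTimeout("tcp", addr, timeout)
		if err != nil {
			return err
		}
		return conn.Close()
	}

	func main() {
		// A listener on an ephemeral port stands in for a live apiserver.
		ln, err := net.Listen("tcp", "127.0.0.1:0")
		if err != nil {
			panic(err)
		}
		defer ln.Close()

		fmt.Println("reachable:", probe(ln.Addr().String(), time.Second) == nil)
	}
	```

	If the dial succeeds but `/healthz` still fails, the problem is inside the server; here the dial itself fails, so the fix has to start with the kubelet.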
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:58:24 up  7:40,  0 user,  load average: 0.32, 0.22, 0.36
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:58:21 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2306.
	Dec 18 00:58:22 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:22 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:22 functional-288604 kubelet[22640]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:22 functional-288604 kubelet[22640]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:22 functional-288604 kubelet[22640]: E1218 00:58:22.188479   22640 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2307.
	Dec 18 00:58:22 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:22 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:22 functional-288604 kubelet[22645]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:22 functional-288604 kubelet[22645]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:22 functional-288604 kubelet[22645]: E1218 00:58:22.947340   22645 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:22 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:23 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2308.
	Dec 18 00:58:23 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:23 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:23 functional-288604 kubelet[22666]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:23 functional-288604 kubelet[22666]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:23 functional-288604 kubelet[22666]: E1218 00:58:23.714384   22666 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:23 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:23 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (380.364398ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.36s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.66s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:56:27.143887 1159552 retry.go:31] will retry after 4.37839318s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:56:41.523499 1159552 retry.go:31] will retry after 5.410812696s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:56:56.935335 1159552 retry.go:31] will retry after 8.979324788s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:57:15.916693 1159552 retry.go:31] will retry after 6.143202367s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:57:32.061065 1159552 retry.go:31] will retry after 14.14560691s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1218 00:57:56.207804 1159552 retry.go:31] will retry after 16.310780012s: Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
E1218 00:58:19.390579 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: (previous warning repeated 72 more times)
E1218 00:59:32.022027 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: (previous warning repeated 44 more times)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: client rate limiter Wait returned an error: context deadline exceeded
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (304.963362ms)

-- stdout --
	Stopped

-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (337.759161ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                           ARGS                                                                            │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image save kicbase/echo-server:functional-288604 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image rm kicbase/echo-server:functional-288604 --alsologtostderr                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image save --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /etc/ssl/certs/1159552.pem                                                                                                 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /usr/share/ca-certificates/1159552.pem                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /etc/ssl/certs/11595522.pem                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /usr/share/ca-certificates/11595522.pem                                                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh sudo cat /etc/test/nested/copy/1159552/hosts                                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls --format short --alsologtostderr                                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls --format yaml --alsologtostderr                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh            │ functional-288604 ssh pgrep buildkitd                                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ image          │ functional-288604 image build -t localhost/my-image:functional-288604 testdata/build --alsologtostderr                                                    │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls --format json --alsologtostderr                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image          │ functional-288604 image ls --format table --alsologtostderr                                                                                               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ update-context │ functional-288604 update-context --alsologtostderr -v=2                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ update-context │ functional-288604 update-context --alsologtostderr -v=2                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ update-context │ functional-288604 update-context --alsologtostderr -v=2                                                                                                   │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:58:39
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:58:39.583243 1219115 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:58:39.583636 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.583653 1219115 out.go:374] Setting ErrFile to fd 2...
	I1218 00:58:39.583659 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.584016 1219115 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:58:39.584840 1219115 out.go:368] Setting JSON to false
	I1218 00:58:39.585815 1219115 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":27668,"bootTime":1765991852,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:58:39.585921 1219115 start.go:143] virtualization:  
	I1218 00:58:39.589258 1219115 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:58:39.593010 1219115 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:58:39.593078 1219115 notify.go:221] Checking for updates...
	I1218 00:58:39.598753 1219115 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:58:39.601710 1219115 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:58:39.604519 1219115 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:58:39.607336 1219115 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:58:39.610130 1219115 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:58:39.613458 1219115 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:58:39.614052 1219115 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:58:39.641766 1219115 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:58:39.641879 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.697740 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.688189308 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.697856 1219115 docker.go:319] overlay module found
	I1218 00:58:39.701039 1219115 out.go:179] * Using the docker driver based on existing profile
	I1218 00:58:39.703916 1219115 start.go:309] selected driver: docker
	I1218 00:58:39.703938 1219115 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.704044 1219115 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:58:39.704152 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.758556 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.749355176 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.758951 1219115 cni.go:84] Creating CNI manager for ""
	I1218 00:58:39.759012 1219115 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:58:39.759062 1219115 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.762264 1219115 out.go:179] * dry-run validation complete!
	
	
	==> CRI-O <==
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254633749Z" level=info msg="Checking image status: kicbase/echo-server:functional-288604" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254835228Z" level=info msg="Resolving \"kicbase/echo-server\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254880421Z" level=info msg="Image kicbase/echo-server:functional-288604 not found" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.25496105Z" level=info msg="Neither image nor artifact kicbase/echo-server:functional-288604 found" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.279914787Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:functional-288604" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.28013042Z" level=info msg="Image docker.io/kicbase/echo-server:functional-288604 not found" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.280176925Z" level=info msg="Neither image nor artifact docker.io/kicbase/echo-server:functional-288604 found" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304389425Z" level=info msg="Checking image status: localhost/kicbase/echo-server:functional-288604" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304522196Z" level=info msg="Image localhost/kicbase/echo-server:functional-288604 not found" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304557551Z" level=info msg="Neither image nor artifact localhost/kicbase/echo-server:functional-288604 found" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354430814Z" level=info msg="Checking image status: kicbase/echo-server:functional-288604" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354596134Z" level=info msg="Resolving \"kicbase/echo-server\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354673876Z" level=info msg="Image kicbase/echo-server:functional-288604 not found" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354749599Z" level=info msg="Neither image nor artifact kicbase/echo-server:functional-288604 found" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.382209847Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:functional-288604" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.382343676Z" level=info msg="Image docker.io/kicbase/echo-server:functional-288604 not found" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.38238018Z" level=info msg="Neither image nor artifact docker.io/kicbase/echo-server:functional-288604 found" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.406506815Z" level=info msg="Checking image status: localhost/kicbase/echo-server:functional-288604" id=9f001423-fa42-4161-a7dd-f1e36d7025f7 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 01:00:18.727744   25382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 01:00:18.728458   25382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 01:00:18.729955   25382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 01:00:18.730559   25382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 01:00:18.732060   25382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 01:00:18 up  7:42,  0 user,  load average: 0.36, 0.39, 0.42
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 01:00:16 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2459.
	Dec 18 01:00:17 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:17 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:17 functional-288604 kubelet[25260]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:17 functional-288604 kubelet[25260]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:17 functional-288604 kubelet[25260]: E1218 01:00:17.191492   25260 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2460.
	Dec 18 01:00:17 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:17 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:17 functional-288604 kubelet[25280]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:17 functional-288604 kubelet[25280]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:17 functional-288604 kubelet[25280]: E1218 01:00:17.975549   25280 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:00:17 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:00:18 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2461.
	Dec 18 01:00:18 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:18 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:00:18 functional-288604 kubelet[25373]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:18 functional-288604 kubelet[25373]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:00:18 functional-288604 kubelet[25373]: E1218 01:00:18.699534   25373 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:00:18 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:00:18 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
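The kubelet restart loop in the captured logs above (restart counter at 2461) traces to a single validation error: this kubelet build refuses to start on a host using cgroup v1. A quick way to check which cgroup hierarchy a host mounts is to probe for the v2 marker file; a minimal diagnostic sketch (the helper name `cgroupVersion` is mine, not part of the test suite):

```go
package main

import (
	"fmt"
	"os"
)

// cgroupVersion reports "v2" when the unified-hierarchy marker file is
// present, "v1" when /sys/fs/cgroup exists without it, and "unknown"
// when no cgroup filesystem is mounted (e.g. on non-Linux hosts).
func cgroupVersion() string {
	// cgroup.controllers only exists at the root of the unified (v2) hierarchy.
	if _, err := os.Stat("/sys/fs/cgroup/cgroup.controllers"); err == nil {
		return "v2"
	}
	if _, err := os.Stat("/sys/fs/cgroup"); err == nil {
		return "v1"
	}
	return "unknown"
}

func main() {
	fmt.Println("host cgroup hierarchy:", cgroupVersion())
}
```

On a "v1" result, the repeated `kubelet.service: Scheduled restart job` entries above are expected: systemd keeps restarting a kubelet that fails the same validation each time.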
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (309.595347ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.66s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (1.41s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-288604 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-288604 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (59.054914ms)

-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-288604 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-288604
helpers_test.go:244: (dbg) docker inspect functional-288604:

-- stdout --
	[
	    {
	        "Id": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	        "Created": "2025-12-18T00:29:14.364658737Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1190310,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T00:29:14.421583796Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hostname",
	        "HostsPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/hosts",
	        "LogPath": "/var/lib/docker/containers/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7/421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7-json.log",
	        "Name": "/functional-288604",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-288604:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-288604",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "421416a6c407469ade8a0acfb869647edd4c6380c3e46dcb9b6eb5faf9c152e7",
	                "LowerDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/merged",
	                "UpperDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/diff",
	                "WorkDir": "/var/lib/docker/overlay2/655fcd95dd7599a0622587dc41c42912b2606256f986f10173e4414a994c7fdd/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-288604",
	                "Source": "/var/lib/docker/volumes/functional-288604/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-288604",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-288604",
	                "name.minikube.sigs.k8s.io": "functional-288604",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "2e04c93d0967d8c5bac5200abe4456cf96fc0454d87881529427725525a8db4b",
	            "SandboxKey": "/var/run/docker/netns/2e04c93d0967",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33925"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33926"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33929"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33927"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33928"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-288604": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "52:f5:fc:ac:48:e1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "a1ab1ee989c3c3b500cebc253e14ed97fdea30d4b87fac26cd1d6dacd50faae4",
	                    "EndpointID": "c6e80d40075aa4082130f1795580c51aab7cf34c510037dba385d9716160eac5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-288604",
	                        "421416a6c407"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-288604 -n functional-288604: exit status 2 (312.987016ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                           ARGS                                                                            │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ mount     │ -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount3 --alsologtostderr -v=1                      │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount1                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh findmnt -T /mount1                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh findmnt -T /mount2                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh findmnt -T /mount3                                                                                                                  │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ mount     │ -p functional-288604 --kill=true                                                                                                                          │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1               │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ start     │ -p functional-288604 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1                         │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-288604 --alsologtostderr -v=1                                                                                            │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ license   │                                                                                                                                                           │ minikube          │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ ssh       │ functional-288604 ssh sudo systemctl is-active docker                                                                                                     │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ ssh       │ functional-288604 ssh sudo systemctl is-active containerd                                                                                                 │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │                     │
	│ image     │ functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image save kicbase/echo-server:functional-288604 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image rm kicbase/echo-server:functional-288604 --alsologtostderr                                                                        │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image ls                                                                                                                                │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	│ image     │ functional-288604 image save --daemon kicbase/echo-server:functional-288604 --alsologtostderr                                                             │ functional-288604 │ jenkins │ v1.37.0 │ 18 Dec 25 00:58 UTC │ 18 Dec 25 00:58 UTC │
	└───────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:58:39
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:58:39.583243 1219115 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:58:39.583636 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.583653 1219115 out.go:374] Setting ErrFile to fd 2...
	I1218 00:58:39.583659 1219115 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.584016 1219115 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:58:39.584840 1219115 out.go:368] Setting JSON to false
	I1218 00:58:39.585815 1219115 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":27668,"bootTime":1765991852,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:58:39.585921 1219115 start.go:143] virtualization:  
	I1218 00:58:39.589258 1219115 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:58:39.593010 1219115 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:58:39.593078 1219115 notify.go:221] Checking for updates...
	I1218 00:58:39.598753 1219115 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:58:39.601710 1219115 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:58:39.604519 1219115 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:58:39.607336 1219115 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:58:39.610130 1219115 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:58:39.613458 1219115 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:58:39.614052 1219115 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:58:39.641766 1219115 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:58:39.641879 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.697740 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.688189308 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.697856 1219115 docker.go:319] overlay module found
	I1218 00:58:39.701039 1219115 out.go:179] * Using the docker driver based on existing profile
	I1218 00:58:39.703916 1219115 start.go:309] selected driver: docker
	I1218 00:58:39.703938 1219115 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.704044 1219115 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:58:39.704152 1219115 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.758556 1219115 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.749355176 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.758951 1219115 cni.go:84] Creating CNI manager for ""
	I1218 00:58:39.759012 1219115 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:58:39.759062 1219115 start.go:353] cluster config:
	{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Con
tainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.762264 1219115 out.go:179] * dry-run validation complete!
	
	
	==> CRI-O <==
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.946872801Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=ede262bb-aa24-43f4-acb8-56a983b40b94 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.947558336Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=bc3be1a2-0177-4d93-a4c5-aaa9ffd553ae name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948135017Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=abe400b9-a088-4251-abf4-5ea417b9beaf name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.948611836Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=17f9e683-6615-4c3f-b210-328b50ea255a name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949049075Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=abf31b7e-df55-4588-bf3f-b260bc7bb900 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.949482524Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=67a32504-1224-4572-bb6c-29616b8546f2 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:52:06 functional-288604 crio[9949]: time="2025-12-18T00:52:06.94989084Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d4c85de9-9231-44f1-a9ab-86962d2bbdbd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254633749Z" level=info msg="Checking image status: kicbase/echo-server:functional-288604" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254835228Z" level=info msg="Resolving \"kicbase/echo-server\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.254880421Z" level=info msg="Image kicbase/echo-server:functional-288604 not found" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.25496105Z" level=info msg="Neither image nor artfiact kicbase/echo-server:functional-288604 found" id=b14f9da8-e72d-4a0b-84dc-6bd92a957bb4 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.279914787Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:functional-288604" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.28013042Z" level=info msg="Image docker.io/kicbase/echo-server:functional-288604 not found" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.280176925Z" level=info msg="Neither image nor artfiact docker.io/kicbase/echo-server:functional-288604 found" id=2b29d48c-c28f-4a16-879c-f277f28d84fd name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304389425Z" level=info msg="Checking image status: localhost/kicbase/echo-server:functional-288604" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304522196Z" level=info msg="Image localhost/kicbase/echo-server:functional-288604 not found" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:43 functional-288604 crio[9949]: time="2025-12-18T00:58:43.304557551Z" level=info msg="Neither image nor artfiact localhost/kicbase/echo-server:functional-288604 found" id=286a01c9-eddc-45d9-992c-e9717ea1afc0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354430814Z" level=info msg="Checking image status: kicbase/echo-server:functional-288604" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354596134Z" level=info msg="Resolving \"kicbase/echo-server\" using unqualified-search registries (/etc/containers/registries.conf.d/crio.conf)"
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354673876Z" level=info msg="Image kicbase/echo-server:functional-288604 not found" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.354749599Z" level=info msg="Neither image nor artfiact kicbase/echo-server:functional-288604 found" id=4d4b495a-c88d-4687-bcab-f0cf85b12466 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.382209847Z" level=info msg="Checking image status: docker.io/kicbase/echo-server:functional-288604" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.382343676Z" level=info msg="Image docker.io/kicbase/echo-server:functional-288604 not found" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.38238018Z" level=info msg="Neither image nor artfiact docker.io/kicbase/echo-server:functional-288604 found" id=2d4be5e8-7f75-4b2f-bf8a-8688deee06a9 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 00:58:46 functional-288604 crio[9949]: time="2025-12-18T00:58:46.406506815Z" level=info msg="Checking image status: localhost/kicbase/echo-server:functional-288604" id=9f001423-fa42-4161-a7dd-f1e36d7025f7 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1218 00:58:48.827163   24127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:48.827915   24127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:48.829446   24127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:48.829913   24127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1218 00:58:48.831380   24127 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec18 00:11] kauditd_printk_skb: 8 callbacks suppressed
	[Dec18 00:13] overlayfs: idmapped layers are currently not supported
	[Dec18 00:18] overlayfs: idmapped layers are currently not supported
	[Dec18 00:19] overlayfs: idmapped layers are currently not supported
	[Dec18 00:43] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 00:58:48 up  7:41,  0 user,  load average: 1.43, 0.50, 0.45
	Linux functional-288604 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 00:58:46 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:46 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2339.
	Dec 18 00:58:46 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:46 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:46 functional-288604 kubelet[23947]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:46 functional-288604 kubelet[23947]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:46 functional-288604 kubelet[23947]: E1218 00:58:46.950609   23947 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:46 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:46 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:47 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2340.
	Dec 18 00:58:47 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:47 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:47 functional-288604 kubelet[23997]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:47 functional-288604 kubelet[23997]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:47 functional-288604 kubelet[23997]: E1218 00:58:47.713261   23997 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:47 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:47 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 00:58:48 functional-288604 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2341.
	Dec 18 00:58:48 functional-288604 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:48 functional-288604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 00:58:48 functional-288604 kubelet[24042]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:48 functional-288604 kubelet[24042]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 00:58:48 functional-288604 kubelet[24042]: E1218 00:58:48.452301   24042 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 00:58:48 functional-288604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 00:58:48 functional-288604 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-288604 -n functional-288604: exit status 2 (342.889127ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-288604" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (1.41s)
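The kubelet units in the log above crash-loop on "kubelet is configured to not run on a host using cgroup v1", which is why the apiserver never returns and every kubectl call sees connection refused. As a diagnostic aid (not part of the test run), here is a minimal sketch that classifies a host's cgroup mode from the filesystem type mounted at /sys/fs/cgroup; the `cgroup_mode` helper is our naming, not minikube's:

```shell
# Classify the host cgroup hierarchy. On a unified (v2) host,
# `stat -fc %T /sys/fs/cgroup` prints "cgroup2fs"; the legacy v1
# layout mounts a tmpfs there, which is what the kubelet rejects.
cgroup_mode() {
  case "$1" in
    cgroup2fs) echo "v2" ;;
    tmpfs)     echo "v1" ;;
    *)         echo "unknown" ;;
  esac
}

cgroup_mode "$(stat -fc %T /sys/fs/cgroup 2>/dev/null || echo unknown)"
```

A "v1" result here would match the failure mode in the kubelet log; hosts reporting "v2" should not hit this validation error.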

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.55s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1218 00:56:16.594495 1214728 out.go:360] Setting OutFile to fd 1 ...
I1218 00:56:16.594651 1214728 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:56:16.594679 1214728 out.go:374] Setting ErrFile to fd 2...
I1218 00:56:16.594699 1214728 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:56:16.594962 1214728 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:56:16.595229 1214728 mustload.go:66] Loading cluster: functional-288604
I1218 00:56:16.595679 1214728 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:56:16.596184 1214728 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:56:16.621996 1214728 host.go:66] Checking if "functional-288604" exists ...
I1218 00:56:16.622300 1214728 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1218 00:56:16.728333 1214728 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:56:16.716177789 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1218 00:56:16.728443 1214728 api_server.go:166] Checking apiserver status ...
I1218 00:56:16.728502 1214728 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1218 00:56:16.728545 1214728 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:56:16.771620 1214728 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
W1218 00:56:16.899331 1214728 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1218 00:56:16.904371 1214728 out.go:179] * The control-plane node functional-288604 apiserver is not running: (state=Stopped)
I1218 00:56:16.907450 1214728 out.go:179]   To start a cluster, run: "minikube start -p functional-288604"

stdout: * The control-plane node functional-288604 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-288604"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] ...
helpers_test.go:526: unable to kill pid 1214727: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.55s)
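The tunnel's exit code 103 traces back to the `sudo pgrep -xnf kube-apiserver.*minikube.*` probe in the trace above returning status 1. A standalone reduction of that liveness probe (our sketch, not minikube's code) looks like this:

```shell
# pgrep -x matches the whole string, -f matches against the full command
# line, -n picks the newest match. No matching process yields exit
# status 1, which minikube reports as "apiserver ... Stopped" (exit 103).
if pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null 2>&1; then
  echo "apiserver running"
else
  echo "apiserver stopped"
fi
```

On the host in this run the probe finds no process, so the tunnel (and the `service list` commands below) bail out before doing any work.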

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-288604 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-288604 apply -f testdata/testsvc.yaml: exit status 1 (102.995821ms)

** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-288604 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (125.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:288: failed to hit nginx at "http://10.101.219.144": Temporary Error: Get "http://10.101.219.144": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-288604 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-288604 get svc nginx-svc: exit status 1 (57.256134ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-288604 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (125.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-288604 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-288604 create deployment hello-node --image kicbase/echo-server: exit status 1 (61.121242ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-288604 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 service list: exit status 103 (262.477231ms)

-- stdout --
	* The control-plane node functional-288604 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-288604"

-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-288604 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-288604 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-288604\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 service list -o json: exit status 103 (274.821501ms)

-- stdout --
	* The control-plane node functional-288604 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-288604"

-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-288604 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 service --namespace=default --https --url hello-node: exit status 103 (281.926211ms)

-- stdout --
	* The control-plane node functional-288604 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-288604"

-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-288604 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 service hello-node --url --format={{.IP}}: exit status 103 (253.700667ms)

-- stdout --
	* The control-plane node functional-288604 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-288604"

-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-288604 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-288604 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-288604\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 service hello-node --url: exit status 103 (256.830611ms)

-- stdout --
	* The control-plane node functional-288604 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-288604"

-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-288604 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-288604 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-288604"
functional_test.go:1579: failed to parse "* The control-plane node functional-288604 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-288604\"": parse "* The control-plane node functional-288604 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-288604\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.33s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1766019510054259136" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1766019510054259136" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1766019510054259136" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001/test-1766019510054259136
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (332.609998ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1218 00:58:30.387145 1159552 retry.go:31] will retry after 434.10396ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p"
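The first findmnt check above failed and the harness retried after a short delay (retry.go:31: "will retry after 434.10396ms"), after which it succeeded — the 9p mount simply took a moment to appear. The same bounded-retry pattern can be approximated in shell; this is a hedged sketch of the idea, not the harness's actual implementation:

```shell
#!/usr/bin/env bash
# retry ATTEMPTS DELAY CMD...: run CMD, and on failure sleep DELAY seconds
# and try again, up to ATTEMPTS total attempts. Exit status reflects whether
# CMD eventually succeeded.
retry() {
  attempts=$1; shift
  delay=$1; shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$attempts" ]; then
      return 1
    fi
    n=$((n + 1))
    sleep "$delay"
  done
}

retry 3 0 true  && echo "succeeded"
retry 2 0 false || echo "gave up after retries"
```

The real harness uses jittered, growing delays rather than a fixed one, but the success/give-up structure is the same.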
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 18 00:58 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 18 00:58 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 18 00:58 test-1766019510054259136
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh cat /mount-9p/test-1766019510054259136
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-288604 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:148: (dbg) Non-zero exit: kubectl --context functional-288604 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (57.674099ms)

** stderr ** 
	error: error when deleting "testdata/busybox-mount-test.yaml": Delete "https://192.168.49.2:8441/api/v1/namespaces/default/pods/busybox-mount": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test_mount_test.go:150: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-288604 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:80: "TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:81: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (264.493087ms)

-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=39989)
	total 2
	-rw-r--r-- 1 docker docker 24 Dec 18 00:58 created-by-test
	-rw-r--r-- 1 docker docker 24 Dec 18 00:58 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Dec 18 00:58 test-1766019510054259136
	cat: /mount-9p/pod-dates: No such file or directory

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:83: debugging command "out/minikube-linux-arm64 -p functional-288604 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
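The debug output above shows the 9p mount itself was healthy (the `mount | grep 9p` line and the file listing both succeeded); the command only exited non-zero because /mount-9p/pod-dates was never written, since the busybox pod could not be created against the stopped apiserver. The mount-point check the harness performs with `findmnt -T` can also be done straight from /proc/mounts — a sketch for exact mount targets, using "/" as a stand-in for /mount-9p:

```shell
#!/usr/bin/env bash
# is_mounted MOUNT_POINT: succeed if MOUNT_POINT appears as a mount target
# in /proc/mounts. Similar in spirit to `findmnt -T` for exact targets
# (findmnt -T additionally resolves a path to its containing mount).
is_mounted() {
  awk -v mp="$1" '$2 == mp { found = 1 } END { exit !found }' /proc/mounts
}

is_mounted /         && echo "/ is mounted"
is_mounted /mount-9p || echo "/mount-9p is not mounted here"
```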
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:39989
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001 to /mount-9p

* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001:/mount-9p --alsologtostderr -v=1] stderr:
I1218 00:58:30.123605 1217178 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:30.124038 1217178 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:30.124049 1217178 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:30.124053 1217178 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:30.124336 1217178 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:30.124592 1217178 mustload.go:66] Loading cluster: functional-288604
I1218 00:58:30.124968 1217178 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:30.125460 1217178 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:30.144470 1217178 host.go:66] Checking if "functional-288604" exists ...
I1218 00:58:30.144723 1217178 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1218 00:58:30.232774 1217178 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:30.220567347 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1218 00:58:30.232931 1217178 cli_runner.go:164] Run: docker network inspect functional-288604 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1218 00:58:30.261115 1217178 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001 into VM as /mount-9p ...
I1218 00:58:30.264300 1217178 out.go:179]   - Mount type:   9p
I1218 00:58:30.267181 1217178 out.go:179]   - User ID:      docker
I1218 00:58:30.270416 1217178 out.go:179]   - Group ID:     docker
I1218 00:58:30.273575 1217178 out.go:179]   - Version:      9p2000.L
I1218 00:58:30.276582 1217178 out.go:179]   - Message Size: 262144
I1218 00:58:30.279432 1217178 out.go:179]   - Options:      map[]
I1218 00:58:30.282510 1217178 out.go:179]   - Bind Address: 192.168.49.1:39989
I1218 00:58:30.285229 1217178 out.go:179] * Userspace file server: 
I1218 00:58:30.285510 1217178 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1218 00:58:30.285599 1217178 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:30.313659 1217178 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:30.424105 1217178 mount.go:180] unmount for /mount-9p ran successfully
I1218 00:58:30.424130 1217178 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1218 00:58:30.433887 1217178 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=39989,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1218 00:58:30.446634 1217178 main.go:127] stdlog: ufs.go:141 connected
I1218 00:58:30.446795 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tversion tag 65535 msize 262144 version '9P2000.L'
I1218 00:58:30.446842 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rversion tag 65535 msize 262144 version '9P2000'
I1218 00:58:30.447052 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1218 00:58:30.447113 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rattach tag 0 aqid (3b6250 2ef72f20 'd')
I1218 00:58:30.447386 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 0
I1218 00:58:30.447445 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (3b6250 2ef72f20 'd') m d775 at 0 mt 1766019510 l 4096 t 0 d 0 ext )
I1218 00:58:30.452003 1217178 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/.mount-process: {Name:mk2ef9777c0ff7275b9d154e1a92ce196031ab39 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1218 00:58:30.452184 1217178 mount.go:105] mount successful: ""
I1218 00:58:30.455671 1217178 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun240716813/001 to /mount-9p
I1218 00:58:30.458366 1217178 out.go:203] 
I1218 00:58:30.461088 1217178 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1218 00:58:31.388594 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 0
I1218 00:58:31.388677 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (3b6250 2ef72f20 'd') m d775 at 0 mt 1766019510 l 4096 t 0 d 0 ext )
I1218 00:58:31.389020 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 1 
I1218 00:58:31.389057 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 
I1218 00:58:31.389197 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Topen tag 0 fid 1 mode 0
I1218 00:58:31.389252 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Ropen tag 0 qid (3b6250 2ef72f20 'd') iounit 0
I1218 00:58:31.389380 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 0
I1218 00:58:31.389413 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (3b6250 2ef72f20 'd') m d775 at 0 mt 1766019510 l 4096 t 0 d 0 ext )
I1218 00:58:31.389572 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 0 count 262120
I1218 00:58:31.389693 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 258
I1218 00:58:31.389830 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 261862
I1218 00:58:31.389863 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.390003 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 262120
I1218 00:58:31.390030 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.390167 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1218 00:58:31.390202 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6251 2ef72f20 '') 
I1218 00:58:31.390321 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.390355 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (3b6251 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.390490 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.390520 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (3b6251 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.390681 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.390712 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.390835 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1218 00:58:31.390869 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6252 2ef72f20 '') 
I1218 00:58:31.390995 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.391026 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (3b6252 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.391144 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.391176 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (3b6252 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.391316 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.391339 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.391462 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'test-1766019510054259136' 
I1218 00:58:31.391493 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6253 2ef72f20 '') 
I1218 00:58:31.391616 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.391647 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.391761 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.391788 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.391918 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.391938 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.392049 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 262120
I1218 00:58:31.392078 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.392201 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 1
I1218 00:58:31.392248 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.664308 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 1 0:'test-1766019510054259136' 
I1218 00:58:31.664381 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6253 2ef72f20 '') 
I1218 00:58:31.664601 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 1
I1218 00:58:31.664662 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.664824 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 1 newfid 2 
I1218 00:58:31.664857 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 
I1218 00:58:31.664974 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Topen tag 0 fid 2 mode 0
I1218 00:58:31.665018 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Ropen tag 0 qid (3b6253 2ef72f20 '') iounit 0
I1218 00:58:31.665138 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 1
I1218 00:58:31.665203 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.665346 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 2 offset 0 count 262120
I1218 00:58:31.665390 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 24
I1218 00:58:31.665492 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 2 offset 24 count 262120
I1218 00:58:31.665519 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.665647 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 2 offset 24 count 262120
I1218 00:58:31.665679 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.665829 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.665858 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.666037 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 1
I1218 00:58:31.666071 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.990717 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 0
I1218 00:58:31.990790 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (3b6250 2ef72f20 'd') m d775 at 0 mt 1766019510 l 4096 t 0 d 0 ext )
I1218 00:58:31.991110 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 1 
I1218 00:58:31.991144 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 
I1218 00:58:31.991282 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Topen tag 0 fid 1 mode 0
I1218 00:58:31.991330 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Ropen tag 0 qid (3b6250 2ef72f20 'd') iounit 0
I1218 00:58:31.991456 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 0
I1218 00:58:31.991490 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (3b6250 2ef72f20 'd') m d775 at 0 mt 1766019510 l 4096 t 0 d 0 ext )
I1218 00:58:31.991645 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 0 count 262120
I1218 00:58:31.991746 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 258
I1218 00:58:31.991871 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 261862
I1218 00:58:31.991897 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.992029 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 262120
I1218 00:58:31.992052 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.992178 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1218 00:58:31.992246 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6251 2ef72f20 '') 
I1218 00:58:31.992377 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.992412 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (3b6251 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.992547 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.992580 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (3b6251 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.992706 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.992736 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.992900 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1218 00:58:31.992938 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6252 2ef72f20 '') 
I1218 00:58:31.993070 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.993102 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (3b6252 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.993220 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.993251 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (3b6252 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.993383 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.993406 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.993530 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 2 0:'test-1766019510054259136' 
I1218 00:58:31.993560 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rwalk tag 0 (3b6253 2ef72f20 '') 
I1218 00:58:31.993683 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.993718 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.993831 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tstat tag 0 fid 2
I1218 00:58:31.993870 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rstat tag 0 st ('test-1766019510054259136' 'jenkins' 'jenkins' '' q (3b6253 2ef72f20 '') m 644 at 0 mt 1766019510 l 24 t 0 d 0 ext )
I1218 00:58:31.994003 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 2
I1218 00:58:31.994025 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.994179 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tread tag 0 fid 1 offset 258 count 262120
I1218 00:58:31.994231 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rread tag 0 count 0
I1218 00:58:31.994384 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 1
I1218 00:58:31.994417 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:31.995495 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1218 00:58:31.995558 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rerror tag 0 ename 'file not found' ecode 0
I1218 00:58:32.262701 1217178 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:49368 Tclunk tag 0 fid 0
I1218 00:58:32.262770 1217178 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:49368 Rclunk tag 0
I1218 00:58:32.263748 1217178 main.go:127] stdlog: ufs.go:147 disconnected
I1218 00:58:32.283957 1217178 out.go:179] * Unmounting /mount-9p ...
I1218 00:58:32.286888 1217178 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1218 00:58:32.294114 1217178 mount.go:180] unmount for /mount-9p ran successfully
I1218 00:58:32.294220 1217178 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/.mount-process: {Name:mk2ef9777c0ff7275b9d154e1a92ce196031ab39 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1218 00:58:32.297377 1217178 out.go:203] 
W1218 00:58:32.300469 1217178 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1218 00:58:32.303386 1217178 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.33s)

TestJSONOutput/pause/Command (1.68s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-368600 --output=json --user=testUser
json_output_test.go:63: (dbg) Non-zero exit: out/minikube-linux-arm64 pause -p json-output-368600 --output=json --user=testUser: exit status 80 (1.679524644s)

-- stdout --
	{"specversion":"1.0","id":"dd9345cc-800d-4963-a7a8-aac60348f491","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"Pausing node json-output-368600 ...","name":"Pausing","totalsteps":"1"}}
	{"specversion":"1.0","id":"61aee587-7b11-43bd-89ac-d341252aeb26","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"80","issues":"","message":"Pause: list running: runc: sudo runc list -f json: Process exited with status 1\nstdout:\n\nstderr:\ntime=\"2025-12-18T01:10:51Z\" level=error msg=\"open /run/runc: no such file or directory\"","name":"GUEST_PAUSE","url":""}}
	{"specversion":"1.0","id":"25c7842f-1251-4b5d-bdb6-80d19b38ce24","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"message":"╭───────────────────────────────────────────────────────────────────────────────────────────╮\n│                                                                                           │\n│    If the above advice does not help, please let us know:                                 │\n│    https://github.com/kubernetes/minikube/issues/new/choose                               │\n│                                                                                           │\n│    Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │\n│    Please also attach the following file to the GitHub issue:                             │\n│    - /tmp/minikube_pause_49fdaea37aad8ebccb761973c21590cc64efe8d9_0.log                   │\n│                                                                                           │\n╰───────────────────────────────────────────────────────────────────────────────────────────╯"}}

-- /stdout --
json_output_test.go:65: failed to clean up: args "out/minikube-linux-arm64 pause -p json-output-368600 --output=json --user=testUser": exit status 80
--- FAIL: TestJSONOutput/pause/Command (1.68s)

TestJSONOutput/unpause/Command (2.1s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-368600 --output=json --user=testUser
json_output_test.go:63: (dbg) Non-zero exit: out/minikube-linux-arm64 unpause -p json-output-368600 --output=json --user=testUser: exit status 80 (2.104011697s)

-- stdout --
	{"specversion":"1.0","id":"3cada260-7a2b-481a-b076-f8022b46ebba","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"Unpausing node json-output-368600 ...","name":"Unpausing","totalsteps":"1"}}
	{"specversion":"1.0","id":"1fc4b457-e8b4-4694-8e6a-322ceec8d096","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"80","issues":"","message":"Pause: list paused: runc: sudo runc list -f json: Process exited with status 1\nstdout:\n\nstderr:\ntime=\"2025-12-18T01:10:53Z\" level=error msg=\"open /run/runc: no such file or directory\"","name":"GUEST_UNPAUSE","url":""}}
	{"specversion":"1.0","id":"49c28c25-b45f-4e8c-8a1b-570031ed20a5","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"message":"╭───────────────────────────────────────────────────────────────────────────────────────────╮\n│                                                                                           │\n│    If the above advice does not help, please let us know:                                 │\n│    https://github.com/kubernetes/minikube/issues/new/choose                               │\n│                                                                                           │\n│    Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │\n│    Please also attach the following file to the GitHub issue:                             │\n│    - /tmp/minikube_unpause_85c908ac827001a7ced33feb0caf7da086d17584_0.log                 │\n│                                                                                           │\n╰───────────────────────────────────────────────────────────────────────────────────────────╯"}}

-- /stdout --
json_output_test.go:65: failed to clean up: args "out/minikube-linux-arm64 unpause -p json-output-368600 --output=json --user=testUser": exit status 80
--- FAIL: TestJSONOutput/unpause/Command (2.10s)

TestKubernetesUpgrade (796.4s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: (43.702361283s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-823559
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-823559: (1.516427213s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-823559 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-823559 status --format={{.Host}}: exit status 7 (230.532417ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: exit status 109 (12m25.654862709s)

-- stdout --
	* [kubernetes-upgrade-823559] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-823559" primary control-plane node in "kubernetes-upgrade-823559" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	
	

-- /stdout --
** stderr ** 
	I1218 01:27:30.540756 1340150 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:27:30.540864 1340150 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:27:30.540870 1340150 out.go:374] Setting ErrFile to fd 2...
	I1218 01:27:30.540881 1340150 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:27:30.541268 1340150 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:27:30.541707 1340150 out.go:368] Setting JSON to false
	I1218 01:27:30.544697 1340150 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":29399,"bootTime":1765991852,"procs":171,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 01:27:30.544787 1340150 start.go:143] virtualization:  
	I1218 01:27:30.549002 1340150 out.go:179] * [kubernetes-upgrade-823559] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 01:27:30.552408 1340150 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 01:27:30.552898 1340150 notify.go:221] Checking for updates...
	I1218 01:27:30.558229 1340150 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 01:27:30.561042 1340150 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:27:30.563928 1340150 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 01:27:30.566757 1340150 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 01:27:30.569825 1340150 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 01:27:30.573421 1340150 config.go:182] Loaded profile config "kubernetes-upgrade-823559": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.28.0
	I1218 01:27:30.574049 1340150 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 01:27:30.609239 1340150 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 01:27:30.609373 1340150 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:27:30.724062 1340150 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:54 SystemTime:2025-12-18 01:27:30.711509315 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:27:30.724166 1340150 docker.go:319] overlay module found
	I1218 01:27:30.727646 1340150 out.go:179] * Using the docker driver based on existing profile
	I1218 01:27:30.730658 1340150 start.go:309] selected driver: docker
	I1218 01:27:30.730689 1340150 start.go:927] validating driver "docker" against &{Name:kubernetes-upgrade-823559 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-823559 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:27:30.730789 1340150 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 01:27:30.731487 1340150 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:27:30.848691 1340150 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:54 SystemTime:2025-12-18 01:27:30.836435814 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:27:30.848997 1340150 cni.go:84] Creating CNI manager for ""
	I1218 01:27:30.849055 1340150 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:27:30.849093 1340150 start.go:353] cluster config:
	{Name:kubernetes-upgrade-823559 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-823559 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:27:30.854672 1340150 out.go:179] * Starting "kubernetes-upgrade-823559" primary control-plane node in "kubernetes-upgrade-823559" cluster
	I1218 01:27:30.857620 1340150 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 01:27:30.861809 1340150 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 01:27:30.866852 1340150 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 01:27:30.866899 1340150 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 01:27:30.866908 1340150 cache.go:65] Caching tarball of preloaded images
	I1218 01:27:30.866991 1340150 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 01:27:30.867005 1340150 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 01:27:30.867189 1340150 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 01:27:30.867341 1340150 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/config.json ...
	I1218 01:27:30.908491 1340150 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 01:27:30.908511 1340150 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 01:27:30.908528 1340150 cache.go:243] Successfully downloaded all kic artifacts
	I1218 01:27:30.908556 1340150 start.go:360] acquireMachinesLock for kubernetes-upgrade-823559: {Name:mke567debe2a63cf6629028df24860ce35223da1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 01:27:30.908607 1340150 start.go:364] duration metric: took 34.509µs to acquireMachinesLock for "kubernetes-upgrade-823559"
	I1218 01:27:30.908626 1340150 start.go:96] Skipping create...Using existing machine configuration
	I1218 01:27:30.908631 1340150 fix.go:54] fixHost starting: 
	I1218 01:27:30.908890 1340150 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-823559 --format={{.State.Status}}
	I1218 01:27:30.949297 1340150 fix.go:112] recreateIfNeeded on kubernetes-upgrade-823559: state=Stopped err=<nil>
	W1218 01:27:30.949331 1340150 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 01:27:30.952444 1340150 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-823559" ...
	I1218 01:27:30.952523 1340150 cli_runner.go:164] Run: docker start kubernetes-upgrade-823559
	I1218 01:27:31.263208 1340150 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-823559 --format={{.State.Status}}
	I1218 01:27:31.286400 1340150 kic.go:430] container "kubernetes-upgrade-823559" state is running.
	I1218 01:27:31.286789 1340150 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-823559
	I1218 01:27:31.311018 1340150 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/config.json ...
	I1218 01:27:31.311259 1340150 machine.go:94] provisionDockerMachine start ...
	I1218 01:27:31.311318 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:31.334609 1340150 main.go:143] libmachine: Using SSH client type: native
	I1218 01:27:31.334944 1340150 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34155 <nil> <nil>}
	I1218 01:27:31.334953 1340150 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 01:27:31.336499 1340150 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1218 01:27:34.504711 1340150 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-823559
	
	I1218 01:27:34.504788 1340150 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-823559"
	I1218 01:27:34.504896 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:34.529877 1340150 main.go:143] libmachine: Using SSH client type: native
	I1218 01:27:34.530186 1340150 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34155 <nil> <nil>}
	I1218 01:27:34.530197 1340150 main.go:143] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-823559 && echo "kubernetes-upgrade-823559" | sudo tee /etc/hostname
	I1218 01:27:34.724744 1340150 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-823559
	
	I1218 01:27:34.724823 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:34.761098 1340150 main.go:143] libmachine: Using SSH client type: native
	I1218 01:27:34.761408 1340150 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34155 <nil> <nil>}
	I1218 01:27:34.761440 1340150 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-823559' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-823559/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-823559' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 01:27:34.920650 1340150 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 01:27:34.920679 1340150 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 01:27:34.920716 1340150 ubuntu.go:190] setting up certificates
	I1218 01:27:34.920731 1340150 provision.go:84] configureAuth start
	I1218 01:27:34.920797 1340150 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-823559
	I1218 01:27:34.942835 1340150 provision.go:143] copyHostCerts
	I1218 01:27:34.942909 1340150 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 01:27:34.942918 1340150 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 01:27:34.942990 1340150 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 01:27:34.943078 1340150 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 01:27:34.943083 1340150 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 01:27:34.943107 1340150 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 01:27:34.943172 1340150 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 01:27:34.943178 1340150 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 01:27:34.943203 1340150 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 01:27:34.943251 1340150 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-823559 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-823559 localhost minikube]
	I1218 01:27:35.225316 1340150 provision.go:177] copyRemoteCerts
	I1218 01:27:35.225390 1340150 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 01:27:35.225436 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:35.243539 1340150 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34155 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/kubernetes-upgrade-823559/id_rsa Username:docker}
	I1218 01:27:35.349571 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1218 01:27:35.371938 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 01:27:35.406532 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1218 01:27:35.447393 1340150 provision.go:87] duration metric: took 526.640921ms to configureAuth
	I1218 01:27:35.447417 1340150 ubuntu.go:206] setting minikube options for container-runtime
	I1218 01:27:35.447603 1340150 config.go:182] Loaded profile config "kubernetes-upgrade-823559": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 01:27:35.447719 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:35.495518 1340150 main.go:143] libmachine: Using SSH client type: native
	I1218 01:27:35.495868 1340150 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34155 <nil> <nil>}
	I1218 01:27:35.495884 1340150 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 01:27:35.869645 1340150 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 01:27:35.869673 1340150 machine.go:97] duration metric: took 4.558404292s to provisionDockerMachine
	I1218 01:27:35.869685 1340150 start.go:293] postStartSetup for "kubernetes-upgrade-823559" (driver="docker")
	I1218 01:27:35.869697 1340150 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 01:27:35.869763 1340150 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 01:27:35.869832 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:35.891706 1340150 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34155 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/kubernetes-upgrade-823559/id_rsa Username:docker}
	I1218 01:27:36.000320 1340150 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 01:27:36.005814 1340150 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 01:27:36.005843 1340150 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 01:27:36.005855 1340150 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 01:27:36.005921 1340150 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 01:27:36.006001 1340150 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 01:27:36.006106 1340150 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 01:27:36.019308 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:27:36.044320 1340150 start.go:296] duration metric: took 174.618464ms for postStartSetup
	I1218 01:27:36.044404 1340150 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:27:36.044461 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:36.065851 1340150 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34155 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/kubernetes-upgrade-823559/id_rsa Username:docker}
	I1218 01:27:36.175285 1340150 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 01:27:36.181283 1340150 fix.go:56] duration metric: took 5.27263592s for fixHost
	I1218 01:27:36.181313 1340150 start.go:83] releasing machines lock for "kubernetes-upgrade-823559", held for 5.272696784s
	I1218 01:27:36.181383 1340150 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-823559
	I1218 01:27:36.207593 1340150 ssh_runner.go:195] Run: cat /version.json
	I1218 01:27:36.207640 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:36.208114 1340150 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 01:27:36.208171 1340150 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-823559
	I1218 01:27:36.241931 1340150 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34155 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/kubernetes-upgrade-823559/id_rsa Username:docker}
	I1218 01:27:36.245266 1340150 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34155 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/kubernetes-upgrade-823559/id_rsa Username:docker}
	I1218 01:27:36.356692 1340150 ssh_runner.go:195] Run: systemctl --version
	I1218 01:27:36.469685 1340150 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 01:27:36.535163 1340150 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 01:27:36.540098 1340150 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 01:27:36.540175 1340150 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 01:27:36.549939 1340150 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 01:27:36.549971 1340150 start.go:496] detecting cgroup driver to use...
	I1218 01:27:36.550014 1340150 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 01:27:36.550088 1340150 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 01:27:36.566492 1340150 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 01:27:36.581232 1340150 docker.go:218] disabling cri-docker service (if available) ...
	I1218 01:27:36.581314 1340150 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 01:27:36.597972 1340150 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 01:27:36.616041 1340150 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 01:27:36.770530 1340150 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 01:27:36.922706 1340150 docker.go:234] disabling docker service ...
	I1218 01:27:36.922790 1340150 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 01:27:36.939492 1340150 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 01:27:36.954802 1340150 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 01:27:37.116079 1340150 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 01:27:37.256817 1340150 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 01:27:37.270805 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 01:27:37.286838 1340150 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 01:27:37.286910 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.295666 1340150 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 01:27:37.295740 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.304467 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.313184 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.321768 1340150 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 01:27:37.329874 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.338898 1340150 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.347297 1340150 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:27:37.355896 1340150 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 01:27:37.363709 1340150 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 01:27:37.371300 1340150 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:27:37.514578 1340150 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 01:27:37.704480 1340150 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 01:27:37.704598 1340150 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 01:27:37.708778 1340150 start.go:564] Will wait 60s for crictl version
	I1218 01:27:37.708893 1340150 ssh_runner.go:195] Run: which crictl
	I1218 01:27:37.712443 1340150 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 01:27:37.748446 1340150 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 01:27:37.748557 1340150 ssh_runner.go:195] Run: crio --version
	I1218 01:27:37.806741 1340150 ssh_runner.go:195] Run: crio --version
	I1218 01:27:37.839800 1340150 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on CRI-O 1.34.3 ...
	I1218 01:27:37.842732 1340150 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-823559 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 01:27:37.857659 1340150 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1218 01:27:37.861917 1340150 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 01:27:37.871241 1340150 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-823559 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-823559 Namespace:default APIServerHAVIP: APIServ
erName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePat
h: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 01:27:37.871356 1340150 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 01:27:37.871408 1340150 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:27:37.905694 1340150 crio.go:510] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-rc.1". assuming images are not preloaded.
	I1218 01:27:37.905771 1340150 ssh_runner.go:195] Run: which lz4
	I1218 01:27:37.910190 1340150 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I1218 01:27:37.914789 1340150 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I1218 01:27:37.914867 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 --> /preloaded.tar.lz4 (306154261 bytes)
	I1218 01:27:41.176094 1340150 crio.go:462] duration metric: took 3.26595783s to copy over tarball
	I1218 01:27:41.176170 1340150 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I1218 01:27:43.328121 1340150 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.15191492s)
	I1218 01:27:43.328152 1340150 crio.go:469] duration metric: took 2.15202841s to extract the tarball
	I1218 01:27:43.328160 1340150 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I1218 01:27:43.373239 1340150 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:27:43.413319 1340150 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 01:27:43.413345 1340150 cache_images.go:86] Images are preloaded, skipping loading
	I1218 01:27:43.413353 1340150 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-rc.1 crio true true} ...
	I1218 01:27:43.413455 1340150 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=kubernetes-upgrade-823559 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-823559 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 01:27:43.413540 1340150 ssh_runner.go:195] Run: crio config
	I1218 01:27:43.515416 1340150 cni.go:84] Creating CNI manager for ""
	I1218 01:27:43.515453 1340150 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:27:43.515471 1340150 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 01:27:43.515519 1340150 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-823559 NodeName:kubernetes-upgrade-823559 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.c
rt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 01:27:43.515702 1340150 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "kubernetes-upgrade-823559"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 01:27:43.515797 1340150 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1218 01:27:43.523889 1340150 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 01:27:43.523992 1340150 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 01:27:43.531470 1340150 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (380 bytes)
	I1218 01:27:43.544105 1340150 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1218 01:27:43.556316 1340150 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2227 bytes)
	I1218 01:27:43.569935 1340150 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1218 01:27:43.574099 1340150 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1218 01:27:43.584018 1340150 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:27:43.773880 1340150 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 01:27:43.807377 1340150 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559 for IP: 192.168.76.2
	I1218 01:27:43.807438 1340150 certs.go:195] generating shared ca certs ...
	I1218 01:27:43.807476 1340150 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:27:43.807670 1340150 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 01:27:43.807776 1340150 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 01:27:43.807805 1340150 certs.go:257] generating profile certs ...
	I1218 01:27:43.807946 1340150 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/client.key
	I1218 01:27:43.808068 1340150 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/apiserver.key.3d92c9a0
	I1218 01:27:43.808148 1340150 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/proxy-client.key
	I1218 01:27:43.808346 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 01:27:43.808415 1340150 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 01:27:43.808439 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 01:27:43.808511 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 01:27:43.808572 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 01:27:43.808637 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 01:27:43.808727 1340150 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:27:43.809534 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 01:27:43.841691 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 01:27:43.859046 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 01:27:43.884436 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 01:27:43.922173 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1218 01:27:43.945829 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 01:27:43.967764 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 01:27:43.990072 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1218 01:27:44.017089 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 01:27:44.036293 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 01:27:44.059483 1340150 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 01:27:44.090344 1340150 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 01:27:44.104919 1340150 ssh_runner.go:195] Run: openssl version
	I1218 01:27:44.111497 1340150 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:27:44.119251 1340150 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 01:27:44.127090 1340150 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:27:44.130987 1340150 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:27:44.131110 1340150 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:27:44.175916 1340150 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1218 01:27:44.184123 1340150 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 01:27:44.191636 1340150 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 01:27:44.199254 1340150 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 01:27:44.203156 1340150 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 01:27:44.203268 1340150 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 01:27:44.244945 1340150 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 01:27:44.253016 1340150 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 01:27:44.260733 1340150 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 01:27:44.268601 1340150 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 01:27:44.273260 1340150 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 01:27:44.273340 1340150 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 01:27:44.314374 1340150 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 01:27:44.322263 1340150 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 01:27:44.326337 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 01:27:44.368423 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 01:27:44.409403 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 01:27:44.450431 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 01:27:44.491981 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 01:27:44.533039 1340150 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1218 01:27:44.574522 1340150 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-823559 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-823559 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:27:44.574624 1340150 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 01:27:44.574694 1340150 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 01:27:44.601549 1340150 cri.go:89] found id: ""
	I1218 01:27:44.601627 1340150 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 01:27:44.609998 1340150 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 01:27:44.610019 1340150 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 01:27:44.610083 1340150 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 01:27:44.617794 1340150 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 01:27:44.618242 1340150 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-823559" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:27:44.618354 1340150 kubeconfig.go:62] /home/jenkins/minikube-integration/22186-1156339/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-823559" cluster setting kubeconfig missing "kubernetes-upgrade-823559" context setting]
	I1218 01:27:44.618656 1340150 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:27:44.619256 1340150 kapi.go:59] client config for kubernetes-upgrade-823559: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/kubernetes-upgrade-823559/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 01:27:44.619904 1340150 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 01:27:44.619926 1340150 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 01:27:44.619932 1340150 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 01:27:44.619936 1340150 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 01:27:44.619944 1340150 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 01:27:44.620663 1340150 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 01:27:44.633630 1340150 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-18 01:27:01.639278351 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-18 01:27:43.566078302 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///var/run/crio/crio.sock
	   name: "kubernetes-upgrade-823559"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-rc.1
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1218 01:27:44.633661 1340150 kubeadm.go:1161] stopping kube-system containers ...
	I1218 01:27:44.633673 1340150 cri.go:54] listing CRI containers in root : {State:all Name: Namespaces:[kube-system]}
	I1218 01:27:44.633737 1340150 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 01:27:44.694417 1340150 cri.go:89] found id: ""
	I1218 01:27:44.694516 1340150 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1218 01:27:44.729711 1340150 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 01:27:44.738308 1340150 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5639 Dec 18 01:27 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5656 Dec 18 01:27 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Dec 18 01:27 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5600 Dec 18 01:27 /etc/kubernetes/scheduler.conf
	
	I1218 01:27:44.738384 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1218 01:27:44.746934 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1218 01:27:44.755227 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1218 01:27:44.763031 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 01:27:44.763101 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 01:27:44.770912 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1218 01:27:44.778270 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1218 01:27:44.778383 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 01:27:44.786098 1340150 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 01:27:44.794086 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 01:27:44.858588 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 01:27:46.779501 1340150 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.920862879s)
	I1218 01:27:46.779577 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1218 01:27:47.086107 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1218 01:27:47.217407 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1218 01:27:47.291626 1340150 api_server.go:52] waiting for apiserver process to appear ...
	I1218 01:27:47.291765 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:27:47.791931 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... Run: sudo pgrep -xnf kube-apiserver.*minikube.* repeated every ~500ms from 01:27:48 through 01:28:46 without finding an apiserver process ...]
	I1218 01:28:46.792780 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:28:47.291907 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:28:47.292003 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:28:47.317272 1340150 cri.go:89] found id: ""
	I1218 01:28:47.317298 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.317308 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:28:47.317316 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:28:47.317377 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:28:47.343508 1340150 cri.go:89] found id: ""
	I1218 01:28:47.343533 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.343542 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:28:47.343548 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:28:47.343607 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:28:47.368403 1340150 cri.go:89] found id: ""
	I1218 01:28:47.368429 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.368438 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:28:47.368445 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:28:47.368503 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:28:47.394117 1340150 cri.go:89] found id: ""
	I1218 01:28:47.394143 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.394152 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:28:47.394159 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:28:47.394217 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:28:47.422187 1340150 cri.go:89] found id: ""
	I1218 01:28:47.422209 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.422218 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:28:47.422224 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:28:47.422287 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:28:47.446710 1340150 cri.go:89] found id: ""
	I1218 01:28:47.446732 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.446741 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:28:47.446747 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:28:47.446802 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:28:47.471288 1340150 cri.go:89] found id: ""
	I1218 01:28:47.471312 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.471320 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:28:47.471327 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:28:47.471388 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:28:47.496887 1340150 cri.go:89] found id: ""
	I1218 01:28:47.496908 1340150 logs.go:282] 0 containers: []
	W1218 01:28:47.496917 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:28:47.496925 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:28:47.496938 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:28:47.568550 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:28:47.568597 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:28:47.586477 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:28:47.586502 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:28:47.887485 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:28:47.887503 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:28:47.887515 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:28:47.922223 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:28:47.922253 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:28:50.458854 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:28:50.468758 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:28:50.468827 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:28:50.494550 1340150 cri.go:89] found id: ""
	I1218 01:28:50.494575 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.494584 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:28:50.494590 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:28:50.494646 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:28:50.523412 1340150 cri.go:89] found id: ""
	I1218 01:28:50.523436 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.523445 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:28:50.523451 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:28:50.523512 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:28:50.553111 1340150 cri.go:89] found id: ""
	I1218 01:28:50.553133 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.553142 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:28:50.553148 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:28:50.553202 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:28:50.578195 1340150 cri.go:89] found id: ""
	I1218 01:28:50.578219 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.578227 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:28:50.578233 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:28:50.578293 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:28:50.603565 1340150 cri.go:89] found id: ""
	I1218 01:28:50.603590 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.603598 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:28:50.603605 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:28:50.603662 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:28:50.628613 1340150 cri.go:89] found id: ""
	I1218 01:28:50.628635 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.628644 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:28:50.628650 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:28:50.628709 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:28:50.653558 1340150 cri.go:89] found id: ""
	I1218 01:28:50.653579 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.653588 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:28:50.653593 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:28:50.653654 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:28:50.677072 1340150 cri.go:89] found id: ""
	I1218 01:28:50.677096 1340150 logs.go:282] 0 containers: []
	W1218 01:28:50.677105 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:28:50.677114 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:28:50.677125 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:28:50.744528 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:28:50.744566 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:28:50.760240 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:28:50.760269 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:28:50.821689 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:28:50.821713 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:28:50.821727 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:28:50.853370 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:28:50.853402 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:28:53.380338 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:28:53.390246 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:28:53.390309 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:28:53.415748 1340150 cri.go:89] found id: ""
	I1218 01:28:53.415769 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.415776 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:28:53.415782 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:28:53.415855 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:28:53.450652 1340150 cri.go:89] found id: ""
	I1218 01:28:53.450671 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.450679 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:28:53.450685 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:28:53.450740 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:28:53.477467 1340150 cri.go:89] found id: ""
	I1218 01:28:53.477488 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.477496 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:28:53.477502 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:28:53.477556 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:28:53.504338 1340150 cri.go:89] found id: ""
	I1218 01:28:53.504358 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.504366 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:28:53.504372 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:28:53.504429 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:28:53.538283 1340150 cri.go:89] found id: ""
	I1218 01:28:53.538303 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.538311 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:28:53.538319 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:28:53.538374 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:28:53.565369 1340150 cri.go:89] found id: ""
	I1218 01:28:53.565390 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.565399 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:28:53.565405 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:28:53.565460 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:28:53.595966 1340150 cri.go:89] found id: ""
	I1218 01:28:53.595987 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.595996 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:28:53.596002 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:28:53.596056 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:28:53.629670 1340150 cri.go:89] found id: ""
	I1218 01:28:53.629690 1340150 logs.go:282] 0 containers: []
	W1218 01:28:53.629699 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:28:53.629708 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:28:53.629719 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:28:53.663382 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:28:53.663457 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:28:53.704722 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:28:53.704750 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:28:53.788987 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:28:53.789025 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:28:53.808575 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:28:53.808606 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:28:53.908566 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:28:56.408795 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:28:56.419075 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:28:56.419144 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:28:56.444961 1340150 cri.go:89] found id: ""
	I1218 01:28:56.444986 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.444994 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:28:56.445001 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:28:56.445057 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:28:56.472138 1340150 cri.go:89] found id: ""
	I1218 01:28:56.472170 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.472180 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:28:56.472189 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:28:56.472273 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:28:56.498861 1340150 cri.go:89] found id: ""
	I1218 01:28:56.498887 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.498895 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:28:56.498902 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:28:56.498994 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:28:56.525674 1340150 cri.go:89] found id: ""
	I1218 01:28:56.525711 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.525720 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:28:56.525726 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:28:56.525788 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:28:56.550210 1340150 cri.go:89] found id: ""
	I1218 01:28:56.550274 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.550295 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:28:56.550315 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:28:56.550402 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:28:56.575686 1340150 cri.go:89] found id: ""
	I1218 01:28:56.575719 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.575727 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:28:56.575734 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:28:56.575798 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:28:56.601077 1340150 cri.go:89] found id: ""
	I1218 01:28:56.601145 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.601169 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:28:56.601190 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:28:56.601261 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:28:56.629650 1340150 cri.go:89] found id: ""
	I1218 01:28:56.629674 1340150 logs.go:282] 0 containers: []
	W1218 01:28:56.629683 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:28:56.629692 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:28:56.629703 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:28:56.697091 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:28:56.697126 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:28:56.713334 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:28:56.713406 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:28:56.783802 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:28:56.783827 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:28:56.783840 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:28:56.814413 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:28:56.814446 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:28:59.346932 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:28:59.357472 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:28:59.357550 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:28:59.386500 1340150 cri.go:89] found id: ""
	I1218 01:28:59.386522 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.386530 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:28:59.386536 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:28:59.386593 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:28:59.413421 1340150 cri.go:89] found id: ""
	I1218 01:28:59.413498 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.413531 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:28:59.413573 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:28:59.413666 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:28:59.442400 1340150 cri.go:89] found id: ""
	I1218 01:28:59.442424 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.442433 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:28:59.442438 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:28:59.442496 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:28:59.467535 1340150 cri.go:89] found id: ""
	I1218 01:28:59.467560 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.467569 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:28:59.467575 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:28:59.467641 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:28:59.492736 1340150 cri.go:89] found id: ""
	I1218 01:28:59.492761 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.492769 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:28:59.492777 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:28:59.492858 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:28:59.520030 1340150 cri.go:89] found id: ""
	I1218 01:28:59.520063 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.520072 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:28:59.520079 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:28:59.520144 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:28:59.545555 1340150 cri.go:89] found id: ""
	I1218 01:28:59.545578 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.545587 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:28:59.545594 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:28:59.545652 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:28:59.572732 1340150 cri.go:89] found id: ""
	I1218 01:28:59.572755 1340150 logs.go:282] 0 containers: []
	W1218 01:28:59.572763 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:28:59.572811 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:28:59.572839 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:28:59.641335 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:28:59.641374 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:28:59.659089 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:28:59.659168 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:28:59.725549 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:28:59.725568 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:28:59.725580 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:28:59.756597 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:28:59.756631 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:02.286075 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:02.296894 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:02.296964 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:02.327226 1340150 cri.go:89] found id: ""
	I1218 01:29:02.327250 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.327259 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:02.327275 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:02.327337 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:02.359522 1340150 cri.go:89] found id: ""
	I1218 01:29:02.359554 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.359563 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:02.359570 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:02.359640 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:02.388592 1340150 cri.go:89] found id: ""
	I1218 01:29:02.388619 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.388628 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:02.388634 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:02.388692 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:02.414653 1340150 cri.go:89] found id: ""
	I1218 01:29:02.414722 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.414749 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:02.414767 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:02.414854 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:02.443021 1340150 cri.go:89] found id: ""
	I1218 01:29:02.443093 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.443116 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:02.443136 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:02.443225 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:02.471245 1340150 cri.go:89] found id: ""
	I1218 01:29:02.471325 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.471347 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:02.471367 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:02.471453 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:02.498729 1340150 cri.go:89] found id: ""
	I1218 01:29:02.498802 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.498823 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:02.498841 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:02.498928 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:02.529176 1340150 cri.go:89] found id: ""
	I1218 01:29:02.529250 1340150 logs.go:282] 0 containers: []
	W1218 01:29:02.529272 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:02.529293 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:02.529331 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:02.596822 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:02.596858 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:02.612636 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:02.612664 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:02.678127 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:02.678149 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:02.678166 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:02.708955 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:02.708987 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:05.237443 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:05.247278 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:05.247385 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:05.274117 1340150 cri.go:89] found id: ""
	I1218 01:29:05.274183 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.274205 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:05.274218 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:05.274292 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:05.297781 1340150 cri.go:89] found id: ""
	I1218 01:29:05.297807 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.297815 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:05.297845 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:05.297911 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:05.324424 1340150 cri.go:89] found id: ""
	I1218 01:29:05.324496 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.324527 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:05.324545 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:05.324638 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:05.349300 1340150 cri.go:89] found id: ""
	I1218 01:29:05.349323 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.349331 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:05.349337 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:05.349403 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:05.377728 1340150 cri.go:89] found id: ""
	I1218 01:29:05.377795 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.377818 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:05.377833 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:05.377914 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:05.403164 1340150 cri.go:89] found id: ""
	I1218 01:29:05.403196 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.403206 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:05.403212 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:05.403281 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:05.431478 1340150 cri.go:89] found id: ""
	I1218 01:29:05.431548 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.431570 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:05.431583 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:05.431654 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:05.460342 1340150 cri.go:89] found id: ""
	I1218 01:29:05.460418 1340150 logs.go:282] 0 containers: []
	W1218 01:29:05.460441 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:05.460463 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:05.460487 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:05.524322 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:05.524387 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:05.524416 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:05.555061 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:05.555095 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:05.581589 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:05.581619 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:05.653718 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:05.653759 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:08.170549 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:08.181788 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:08.181855 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:08.214042 1340150 cri.go:89] found id: ""
	I1218 01:29:08.214069 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.214078 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:08.214084 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:08.214334 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:08.239049 1340150 cri.go:89] found id: ""
	I1218 01:29:08.239071 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.239079 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:08.239084 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:08.239139 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:08.263526 1340150 cri.go:89] found id: ""
	I1218 01:29:08.263551 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.263560 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:08.263570 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:08.263627 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:08.287846 1340150 cri.go:89] found id: ""
	I1218 01:29:08.287909 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.287932 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:08.287950 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:08.288035 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:08.314560 1340150 cri.go:89] found id: ""
	I1218 01:29:08.314771 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.314800 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:08.314819 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:08.314915 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:08.339203 1340150 cri.go:89] found id: ""
	I1218 01:29:08.339278 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.339302 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:08.339358 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:08.339439 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:08.364846 1340150 cri.go:89] found id: ""
	I1218 01:29:08.364870 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.364879 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:08.364885 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:08.364951 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:08.393676 1340150 cri.go:89] found id: ""
	I1218 01:29:08.393698 1340150 logs.go:282] 0 containers: []
	W1218 01:29:08.393707 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:08.393717 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:08.393731 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:08.455225 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:08.455244 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:08.455256 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:08.485640 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:08.485672 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:08.515308 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:08.515384 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:08.587964 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:08.588001 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:11.104405 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:11.115410 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:11.115487 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:11.161415 1340150 cri.go:89] found id: ""
	I1218 01:29:11.161443 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.161453 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:11.161462 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:11.161542 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:11.237575 1340150 cri.go:89] found id: ""
	I1218 01:29:11.237602 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.237610 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:11.237617 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:11.237674 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:11.266940 1340150 cri.go:89] found id: ""
	I1218 01:29:11.266966 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.266975 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:11.266981 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:11.267043 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:11.306591 1340150 cri.go:89] found id: ""
	I1218 01:29:11.306624 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.306633 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:11.306638 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:11.306697 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:11.345300 1340150 cri.go:89] found id: ""
	I1218 01:29:11.345327 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.345339 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:11.345345 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:11.345402 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:11.378592 1340150 cri.go:89] found id: ""
	I1218 01:29:11.378619 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.378628 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:11.378634 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:11.378697 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:11.414510 1340150 cri.go:89] found id: ""
	I1218 01:29:11.414542 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.414551 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:11.414557 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:11.414623 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:11.442624 1340150 cri.go:89] found id: ""
	I1218 01:29:11.442651 1340150 logs.go:282] 0 containers: []
	W1218 01:29:11.442659 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:11.442668 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:11.442680 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:11.474954 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:11.474983 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:11.546852 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:11.546930 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:11.562667 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:11.562690 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:11.642858 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:11.642927 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:11.642952 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:14.174427 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:14.185363 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:14.185435 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:14.212643 1340150 cri.go:89] found id: ""
	I1218 01:29:14.212665 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.212674 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:14.212681 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:14.212739 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:14.237554 1340150 cri.go:89] found id: ""
	I1218 01:29:14.237577 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.237585 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:14.237591 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:14.237647 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:14.263410 1340150 cri.go:89] found id: ""
	I1218 01:29:14.263432 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.263440 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:14.263446 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:14.263516 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:14.289292 1340150 cri.go:89] found id: ""
	I1218 01:29:14.289316 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.289324 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:14.289330 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:14.289388 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:14.315733 1340150 cri.go:89] found id: ""
	I1218 01:29:14.315754 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.315763 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:14.315769 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:14.315828 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:14.340685 1340150 cri.go:89] found id: ""
	I1218 01:29:14.340708 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.340717 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:14.340723 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:14.340783 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:14.365950 1340150 cri.go:89] found id: ""
	I1218 01:29:14.365972 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.365981 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:14.365986 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:14.366046 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:14.391195 1340150 cri.go:89] found id: ""
	I1218 01:29:14.391216 1340150 logs.go:282] 0 containers: []
	W1218 01:29:14.391223 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:14.391232 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:14.391244 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:14.421789 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:14.421826 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:14.449720 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:14.449749 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:14.522262 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:14.522301 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:14.538532 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:14.538559 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:14.602755 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:17.103657 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:17.114325 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:17.114392 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:17.153716 1340150 cri.go:89] found id: ""
	I1218 01:29:17.153740 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.153755 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:17.153761 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:17.153818 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:17.192308 1340150 cri.go:89] found id: ""
	I1218 01:29:17.192332 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.192342 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:17.192348 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:17.192416 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:17.232505 1340150 cri.go:89] found id: ""
	I1218 01:29:17.232530 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.232539 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:17.232544 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:17.232606 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:17.258588 1340150 cri.go:89] found id: ""
	I1218 01:29:17.258614 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.258622 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:17.258628 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:17.258684 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:17.284685 1340150 cri.go:89] found id: ""
	I1218 01:29:17.284711 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.284720 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:17.284729 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:17.284787 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:17.310045 1340150 cri.go:89] found id: ""
	I1218 01:29:17.310070 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.310079 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:17.310087 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:17.310166 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:17.334885 1340150 cri.go:89] found id: ""
	I1218 01:29:17.334907 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.334916 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:17.334924 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:17.334980 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:17.360324 1340150 cri.go:89] found id: ""
	I1218 01:29:17.360346 1340150 logs.go:282] 0 containers: []
	W1218 01:29:17.360357 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:17.360365 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:17.360377 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:17.388070 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:17.388100 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:17.457045 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:17.457083 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:17.473327 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:17.473353 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:17.537508 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:17.537714 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:17.537737 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:20.068441 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:20.078638 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:20.078711 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:20.105744 1340150 cri.go:89] found id: ""
	I1218 01:29:20.105768 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.105776 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:20.105788 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:20.105862 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:20.135542 1340150 cri.go:89] found id: ""
	I1218 01:29:20.135565 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.135574 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:20.135581 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:20.135645 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:20.171126 1340150 cri.go:89] found id: ""
	I1218 01:29:20.171201 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.171225 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:20.171248 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:20.171388 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:20.206303 1340150 cri.go:89] found id: ""
	I1218 01:29:20.206332 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.206341 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:20.206348 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:20.206405 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:20.231826 1340150 cri.go:89] found id: ""
	I1218 01:29:20.231852 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.231861 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:20.231868 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:20.231924 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:20.256590 1340150 cri.go:89] found id: ""
	I1218 01:29:20.256612 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.256621 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:20.256627 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:20.256683 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:20.282179 1340150 cri.go:89] found id: ""
	I1218 01:29:20.282202 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.282211 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:20.282217 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:20.282278 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:20.310635 1340150 cri.go:89] found id: ""
	I1218 01:29:20.310660 1340150 logs.go:282] 0 containers: []
	W1218 01:29:20.310669 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:20.310679 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:20.310691 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:20.326290 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:20.326316 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:20.392610 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:20.392632 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:20.392644 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:20.424740 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:20.424771 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:20.453054 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:20.453081 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:23.021257 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:23.036761 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:23.036841 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:23.079002 1340150 cri.go:89] found id: ""
	I1218 01:29:23.079032 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.079043 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:23.079049 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:23.079120 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:23.122902 1340150 cri.go:89] found id: ""
	I1218 01:29:23.122923 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.122931 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:23.122938 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:23.122994 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:23.163439 1340150 cri.go:89] found id: ""
	I1218 01:29:23.163461 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.163470 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:23.163476 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:23.163541 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:23.209175 1340150 cri.go:89] found id: ""
	I1218 01:29:23.209198 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.209211 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:23.209218 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:23.209288 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:23.241585 1340150 cri.go:89] found id: ""
	I1218 01:29:23.241611 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.241623 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:23.241629 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:23.241695 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:23.284318 1340150 cri.go:89] found id: ""
	I1218 01:29:23.284398 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.284441 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:23.284480 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:23.284572 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:23.327656 1340150 cri.go:89] found id: ""
	I1218 01:29:23.327726 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.327761 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:23.327785 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:23.327893 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:23.360258 1340150 cri.go:89] found id: ""
	I1218 01:29:23.360333 1340150 logs.go:282] 0 containers: []
	W1218 01:29:23.360360 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:23.360382 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:23.360422 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:23.451490 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:23.451558 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:23.451584 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:23.487274 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:23.487311 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:23.517612 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:23.517688 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:23.594390 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:23.594469 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:26.111613 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:26.121539 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:26.121603 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:26.149395 1340150 cri.go:89] found id: ""
	I1218 01:29:26.149418 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.149426 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:26.149433 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:26.149492 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:26.173467 1340150 cri.go:89] found id: ""
	I1218 01:29:26.173489 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.173497 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:26.173504 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:26.173567 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:26.198418 1340150 cri.go:89] found id: ""
	I1218 01:29:26.198439 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.198448 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:26.198454 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:26.198517 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:26.226378 1340150 cri.go:89] found id: ""
	I1218 01:29:26.226440 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.226466 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:26.226485 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:26.226562 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:26.252138 1340150 cri.go:89] found id: ""
	I1218 01:29:26.252204 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.252253 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:26.252281 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:26.252349 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:26.278607 1340150 cri.go:89] found id: ""
	I1218 01:29:26.278632 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.278641 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:26.278647 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:26.278705 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:26.307068 1340150 cri.go:89] found id: ""
	I1218 01:29:26.307107 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.307117 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:26.307123 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:26.307199 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:26.337433 1340150 cri.go:89] found id: ""
	I1218 01:29:26.337459 1340150 logs.go:282] 0 containers: []
	W1218 01:29:26.337472 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:26.337482 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:26.337492 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:26.368755 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:26.368789 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:26.408745 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:26.408774 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:26.480327 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:26.480359 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:26.497551 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:26.497582 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:26.568806 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:29.069020 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:29.080661 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:29.080729 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:29.117699 1340150 cri.go:89] found id: ""
	I1218 01:29:29.117723 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.117733 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:29.117739 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:29.117798 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:29.158062 1340150 cri.go:89] found id: ""
	I1218 01:29:29.158082 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.158090 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:29.158096 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:29.158155 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:29.188424 1340150 cri.go:89] found id: ""
	I1218 01:29:29.188446 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.188454 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:29.188461 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:29.188530 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:29.213913 1340150 cri.go:89] found id: ""
	I1218 01:29:29.213935 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.213951 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:29.213957 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:29.214013 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:29.238953 1340150 cri.go:89] found id: ""
	I1218 01:29:29.238978 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.238987 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:29.238993 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:29.239051 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:29.262472 1340150 cri.go:89] found id: ""
	I1218 01:29:29.262497 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.262506 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:29.262512 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:29.262571 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:29.287209 1340150 cri.go:89] found id: ""
	I1218 01:29:29.287233 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.287241 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:29.287247 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:29.287310 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:29.313650 1340150 cri.go:89] found id: ""
	I1218 01:29:29.313674 1340150 logs.go:282] 0 containers: []
	W1218 01:29:29.313683 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:29.313693 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:29.313704 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:29.381904 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:29.381939 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:29.402174 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:29.402200 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:29.489260 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:29.489277 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:29.489289 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:29.519559 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:29.519594 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:32.050991 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:32.063562 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:32.063632 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:32.099679 1340150 cri.go:89] found id: ""
	I1218 01:29:32.099707 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.099715 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:32.099722 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:32.099781 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:32.134516 1340150 cri.go:89] found id: ""
	I1218 01:29:32.134543 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.134552 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:32.134558 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:32.134623 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:32.167224 1340150 cri.go:89] found id: ""
	I1218 01:29:32.167251 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.167259 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:32.167265 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:32.167330 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:32.214347 1340150 cri.go:89] found id: ""
	I1218 01:29:32.214375 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.214384 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:32.214391 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:32.214450 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:32.267052 1340150 cri.go:89] found id: ""
	I1218 01:29:32.267082 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.267091 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:32.267100 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:32.267165 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:32.313840 1340150 cri.go:89] found id: ""
	I1218 01:29:32.313872 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.313882 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:32.313895 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:32.313964 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:32.352321 1340150 cri.go:89] found id: ""
	I1218 01:29:32.352343 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.352350 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:32.352356 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:32.352432 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:32.387237 1340150 cri.go:89] found id: ""
	I1218 01:29:32.387260 1340150 logs.go:282] 0 containers: []
	W1218 01:29:32.387269 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:32.387278 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:32.387290 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:32.441740 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:32.441840 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:32.478924 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:32.479028 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:32.576513 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:32.576550 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:32.601573 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:32.601607 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:32.697915 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:35.198633 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:35.210071 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:35.210138 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:35.235715 1340150 cri.go:89] found id: ""
	I1218 01:29:35.235737 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.235745 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:35.235751 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:35.235817 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:35.263871 1340150 cri.go:89] found id: ""
	I1218 01:29:35.263893 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.263901 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:35.263907 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:35.263966 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:35.290188 1340150 cri.go:89] found id: ""
	I1218 01:29:35.290212 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.290221 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:35.290228 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:35.290295 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:35.320743 1340150 cri.go:89] found id: ""
	I1218 01:29:35.320767 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.320776 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:35.320783 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:35.320843 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:35.346097 1340150 cri.go:89] found id: ""
	I1218 01:29:35.346134 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.346143 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:35.346150 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:35.346221 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:35.374718 1340150 cri.go:89] found id: ""
	I1218 01:29:35.374747 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.374756 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:35.374764 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:35.374827 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:35.413192 1340150 cri.go:89] found id: ""
	I1218 01:29:35.413215 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.413224 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:35.413230 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:35.413286 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:35.442476 1340150 cri.go:89] found id: ""
	I1218 01:29:35.442498 1340150 logs.go:282] 0 containers: []
	W1218 01:29:35.442507 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:35.442516 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:35.442528 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:35.519646 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:35.519682 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:35.536181 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:35.536207 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:35.614858 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:35.614880 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:35.614893 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:35.649358 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:35.649392 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:38.200353 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:38.210208 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:38.210276 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:38.237560 1340150 cri.go:89] found id: ""
	I1218 01:29:38.237583 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.237591 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:38.237597 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:38.237656 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:38.262496 1340150 cri.go:89] found id: ""
	I1218 01:29:38.262519 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.262528 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:38.262534 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:38.262591 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:38.290835 1340150 cri.go:89] found id: ""
	I1218 01:29:38.290856 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.290864 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:38.290871 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:38.290941 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:38.316494 1340150 cri.go:89] found id: ""
	I1218 01:29:38.316514 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.316524 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:38.316530 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:38.316601 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:38.340470 1340150 cri.go:89] found id: ""
	I1218 01:29:38.340494 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.340508 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:38.340514 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:38.340577 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:38.364679 1340150 cri.go:89] found id: ""
	I1218 01:29:38.364702 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.364711 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:38.364717 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:38.364774 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:38.390687 1340150 cri.go:89] found id: ""
	I1218 01:29:38.390713 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.390722 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:38.390728 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:38.390964 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:38.426457 1340150 cri.go:89] found id: ""
	I1218 01:29:38.426493 1340150 logs.go:282] 0 containers: []
	W1218 01:29:38.426503 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:38.426512 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:38.426525 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:38.503016 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:38.503055 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:38.520097 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:38.520167 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:38.583031 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:38.583052 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:38.583100 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:38.614341 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:38.614377 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:41.142495 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:41.152170 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:41.152257 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:41.178701 1340150 cri.go:89] found id: ""
	I1218 01:29:41.178722 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.178731 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:41.178738 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:41.178797 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:41.203630 1340150 cri.go:89] found id: ""
	I1218 01:29:41.203657 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.203665 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:41.203671 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:41.203727 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:41.228400 1340150 cri.go:89] found id: ""
	I1218 01:29:41.228424 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.228433 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:41.228440 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:41.228500 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:41.252923 1340150 cri.go:89] found id: ""
	I1218 01:29:41.252945 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.252954 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:41.252960 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:41.253018 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:41.277616 1340150 cri.go:89] found id: ""
	I1218 01:29:41.277639 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.277648 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:41.277655 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:41.277712 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:41.303450 1340150 cri.go:89] found id: ""
	I1218 01:29:41.303473 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.303482 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:41.303491 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:41.303547 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:41.332123 1340150 cri.go:89] found id: ""
	I1218 01:29:41.332144 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.332152 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:41.332158 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:41.332261 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:41.359012 1340150 cri.go:89] found id: ""
	I1218 01:29:41.359034 1340150 logs.go:282] 0 containers: []
	W1218 01:29:41.359043 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:41.359052 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:41.359064 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:41.427046 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:41.427130 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:41.448366 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:41.448392 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:41.521145 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:41.521164 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:41.521176 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:41.552835 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:41.552871 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:44.081086 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:44.090860 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:44.090928 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:44.119065 1340150 cri.go:89] found id: ""
	I1218 01:29:44.119088 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.119096 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:44.119102 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:44.119206 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:44.143583 1340150 cri.go:89] found id: ""
	I1218 01:29:44.143607 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.143617 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:44.143623 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:44.143681 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:44.168113 1340150 cri.go:89] found id: ""
	I1218 01:29:44.168136 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.168145 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:44.168151 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:44.168211 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:44.192852 1340150 cri.go:89] found id: ""
	I1218 01:29:44.192874 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.192883 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:44.192889 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:44.192951 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:44.217123 1340150 cri.go:89] found id: ""
	I1218 01:29:44.217145 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.217153 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:44.217159 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:44.217215 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:44.242684 1340150 cri.go:89] found id: ""
	I1218 01:29:44.242706 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.242714 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:44.242721 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:44.242778 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:44.270670 1340150 cri.go:89] found id: ""
	I1218 01:29:44.270692 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.270701 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:44.270707 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:44.270836 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:44.295091 1340150 cri.go:89] found id: ""
	I1218 01:29:44.295114 1340150 logs.go:282] 0 containers: []
	W1218 01:29:44.295122 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:44.295131 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:44.295143 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:44.360730 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:44.360751 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:44.360763 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:44.390758 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:44.390783 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:44.427529 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:44.427601 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:44.501976 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:44.502013 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:47.019032 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:47.029273 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:47.029347 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:47.058644 1340150 cri.go:89] found id: ""
	I1218 01:29:47.058666 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.058674 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:47.058681 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:47.058738 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:47.086994 1340150 cri.go:89] found id: ""
	I1218 01:29:47.087017 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.087026 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:47.087032 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:47.087089 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:47.116397 1340150 cri.go:89] found id: ""
	I1218 01:29:47.116418 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.116427 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:47.116433 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:47.116491 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:47.149506 1340150 cri.go:89] found id: ""
	I1218 01:29:47.149527 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.149536 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:47.149542 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:47.149607 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:47.173806 1340150 cri.go:89] found id: ""
	I1218 01:29:47.173828 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.173837 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:47.173843 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:47.173901 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:47.198877 1340150 cri.go:89] found id: ""
	I1218 01:29:47.198950 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.198979 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:47.199005 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:47.199080 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:47.223920 1340150 cri.go:89] found id: ""
	I1218 01:29:47.223941 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.223953 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:47.223966 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:47.224047 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:47.249609 1340150 cri.go:89] found id: ""
	I1218 01:29:47.249633 1340150 logs.go:282] 0 containers: []
	W1218 01:29:47.249642 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:47.249651 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:47.249666 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:47.320360 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:47.320398 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:47.336418 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:47.336450 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:47.413665 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:47.413685 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:47.413708 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:47.457152 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:47.457187 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:49.992556 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:50.002349 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:50.002422 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:50.053699 1340150 cri.go:89] found id: ""
	I1218 01:29:50.053721 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.053731 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:50.053737 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:50.053800 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:50.079584 1340150 cri.go:89] found id: ""
	I1218 01:29:50.079609 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.079619 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:50.079625 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:50.079688 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:50.108671 1340150 cri.go:89] found id: ""
	I1218 01:29:50.108695 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.108705 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:50.108711 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:50.108773 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:50.134860 1340150 cri.go:89] found id: ""
	I1218 01:29:50.134890 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.134901 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:50.134908 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:50.134969 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:50.159876 1340150 cri.go:89] found id: ""
	I1218 01:29:50.159903 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.159912 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:50.159919 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:50.159977 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:50.190277 1340150 cri.go:89] found id: ""
	I1218 01:29:50.190299 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.190307 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:50.190314 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:50.190377 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:50.215312 1340150 cri.go:89] found id: ""
	I1218 01:29:50.215389 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.215421 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:50.215440 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:50.215523 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:50.241454 1340150 cri.go:89] found id: ""
	I1218 01:29:50.241517 1340150 logs.go:282] 0 containers: []
	W1218 01:29:50.241535 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:50.241545 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:50.241558 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:50.310604 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:50.310640 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:50.326852 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:50.326880 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:50.395649 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:50.395670 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:50.395682 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:50.432059 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:50.432098 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:52.965621 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:52.980647 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:52.980722 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:53.020782 1340150 cri.go:89] found id: ""
	I1218 01:29:53.020810 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.020819 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:53.020825 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:53.020881 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:53.050061 1340150 cri.go:89] found id: ""
	I1218 01:29:53.050084 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.050093 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:53.050100 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:53.050162 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:53.086943 1340150 cri.go:89] found id: ""
	I1218 01:29:53.086965 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.086973 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:53.086979 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:53.087034 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:53.124198 1340150 cri.go:89] found id: ""
	I1218 01:29:53.124243 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.124252 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:53.124259 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:53.124316 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:53.169215 1340150 cri.go:89] found id: ""
	I1218 01:29:53.169237 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.169246 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:53.169253 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:53.169311 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:53.201552 1340150 cri.go:89] found id: ""
	I1218 01:29:53.201573 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.201582 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:53.201589 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:53.201646 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:53.230959 1340150 cri.go:89] found id: ""
	I1218 01:29:53.231032 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.231056 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:53.231075 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:53.231147 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:53.266209 1340150 cri.go:89] found id: ""
	I1218 01:29:53.266231 1340150 logs.go:282] 0 containers: []
	W1218 01:29:53.266239 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:53.266249 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:53.266261 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:53.342662 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:53.342738 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:53.359043 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:53.359068 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:53.454177 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:53.454194 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:53.454205 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:53.492726 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:53.492803 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:56.059179 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:56.071034 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:56.071104 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:56.106062 1340150 cri.go:89] found id: ""
	I1218 01:29:56.106085 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.106094 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:56.106100 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:56.106163 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:56.131399 1340150 cri.go:89] found id: ""
	I1218 01:29:56.131421 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.131430 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:56.131436 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:56.131493 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:56.157066 1340150 cri.go:89] found id: ""
	I1218 01:29:56.157088 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.157097 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:56.157102 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:56.157160 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:56.183183 1340150 cri.go:89] found id: ""
	I1218 01:29:56.183206 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.183216 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:56.183222 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:56.183288 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:56.210347 1340150 cri.go:89] found id: ""
	I1218 01:29:56.210369 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.210378 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:56.210383 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:56.210448 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:56.235424 1340150 cri.go:89] found id: ""
	I1218 01:29:56.235446 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.235454 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:56.235460 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:56.235524 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:56.260492 1340150 cri.go:89] found id: ""
	I1218 01:29:56.260515 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.260523 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:56.260529 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:56.260596 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:56.292777 1340150 cri.go:89] found id: ""
	I1218 01:29:56.292802 1340150 logs.go:282] 0 containers: []
	W1218 01:29:56.292810 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:56.292819 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:56.292831 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:56.308973 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:56.309002 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:56.373455 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:56.373476 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:56.373488 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:56.405371 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:56.405402 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:29:56.435552 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:56.435579 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:59.004704 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:29:59.015133 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:29:59.015204 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:29:59.044672 1340150 cri.go:89] found id: ""
	I1218 01:29:59.044694 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.044703 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:29:59.044710 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:29:59.044775 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:29:59.077664 1340150 cri.go:89] found id: ""
	I1218 01:29:59.077687 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.077696 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:29:59.077702 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:29:59.077759 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:29:59.105280 1340150 cri.go:89] found id: ""
	I1218 01:29:59.105302 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.105310 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:29:59.105321 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:29:59.105378 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:29:59.130073 1340150 cri.go:89] found id: ""
	I1218 01:29:59.130096 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.130106 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:29:59.130113 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:29:59.130169 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:29:59.157572 1340150 cri.go:89] found id: ""
	I1218 01:29:59.157592 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.157601 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:29:59.157607 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:29:59.157663 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:29:59.181785 1340150 cri.go:89] found id: ""
	I1218 01:29:59.181823 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.181832 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:29:59.181839 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:29:59.181897 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:29:59.207562 1340150 cri.go:89] found id: ""
	I1218 01:29:59.207583 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.207592 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:29:59.207598 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:29:59.207657 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:29:59.233118 1340150 cri.go:89] found id: ""
	I1218 01:29:59.233140 1340150 logs.go:282] 0 containers: []
	W1218 01:29:59.233149 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:29:59.233158 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:29:59.233169 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:29:59.305061 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:29:59.305095 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:29:59.321233 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:29:59.321261 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:29:59.388269 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:29:59.388286 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:29:59.388298 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:29:59.418482 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:29:59.418516 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:01.944874 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:01.955953 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:01.956028 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:01.986886 1340150 cri.go:89] found id: ""
	I1218 01:30:01.986910 1340150 logs.go:282] 0 containers: []
	W1218 01:30:01.986920 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:01.986927 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:01.987005 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:02.018942 1340150 cri.go:89] found id: ""
	I1218 01:30:02.018964 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.018972 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:02.018978 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:02.019037 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:02.050192 1340150 cri.go:89] found id: ""
	I1218 01:30:02.050215 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.050223 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:02.050230 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:02.050292 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:02.087235 1340150 cri.go:89] found id: ""
	I1218 01:30:02.087312 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.087422 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:02.087448 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:02.087535 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:02.123510 1340150 cri.go:89] found id: ""
	I1218 01:30:02.123536 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.123547 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:02.123553 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:02.123618 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:02.153107 1340150 cri.go:89] found id: ""
	I1218 01:30:02.153130 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.153139 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:02.153148 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:02.153206 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:02.186739 1340150 cri.go:89] found id: ""
	I1218 01:30:02.186762 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.186777 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:02.186785 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:02.186861 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:02.216366 1340150 cri.go:89] found id: ""
	I1218 01:30:02.216388 1340150 logs.go:282] 0 containers: []
	W1218 01:30:02.216397 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:02.216406 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:02.216418 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:02.286611 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:02.286646 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:02.303417 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:02.303447 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:02.372951 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:02.372974 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:02.372987 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:02.404117 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:02.404152 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:04.935867 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:04.947381 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:04.947506 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:04.973074 1340150 cri.go:89] found id: ""
	I1218 01:30:04.973099 1340150 logs.go:282] 0 containers: []
	W1218 01:30:04.973107 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:04.973114 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:04.973198 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:04.998550 1340150 cri.go:89] found id: ""
	I1218 01:30:04.998615 1340150 logs.go:282] 0 containers: []
	W1218 01:30:04.998624 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:04.998631 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:04.998814 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:05.037601 1340150 cri.go:89] found id: ""
	I1218 01:30:05.037625 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.037634 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:05.037640 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:05.037723 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:05.069229 1340150 cri.go:89] found id: ""
	I1218 01:30:05.069254 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.069313 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:05.069320 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:05.069402 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:05.100933 1340150 cri.go:89] found id: ""
	I1218 01:30:05.100957 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.100966 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:05.100973 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:05.101063 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:05.130159 1340150 cri.go:89] found id: ""
	I1218 01:30:05.130237 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.130262 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:05.130281 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:05.130347 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:05.155969 1340150 cri.go:89] found id: ""
	I1218 01:30:05.156018 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.156031 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:05.156039 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:05.156117 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:05.183702 1340150 cri.go:89] found id: ""
	I1218 01:30:05.183729 1340150 logs.go:282] 0 containers: []
	W1218 01:30:05.183746 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:05.183756 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:05.183774 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:05.218997 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:05.219026 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:05.289161 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:05.289200 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:05.305495 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:05.305523 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:05.369194 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:05.369216 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:05.369228 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:07.900376 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:07.911141 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:07.911220 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:07.945097 1340150 cri.go:89] found id: ""
	I1218 01:30:07.945125 1340150 logs.go:282] 0 containers: []
	W1218 01:30:07.945135 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:07.945141 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:07.945198 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:07.975100 1340150 cri.go:89] found id: ""
	I1218 01:30:07.975126 1340150 logs.go:282] 0 containers: []
	W1218 01:30:07.975135 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:07.975142 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:07.975196 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:08.003656 1340150 cri.go:89] found id: ""
	I1218 01:30:08.003694 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.003705 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:08.003712 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:08.003784 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:08.031562 1340150 cri.go:89] found id: ""
	I1218 01:30:08.031601 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.031609 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:08.031616 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:08.031685 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:08.058818 1340150 cri.go:89] found id: ""
	I1218 01:30:08.058845 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.058862 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:08.058868 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:08.058941 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:08.091800 1340150 cri.go:89] found id: ""
	I1218 01:30:08.091837 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.091846 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:08.091853 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:08.091923 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:08.121967 1340150 cri.go:89] found id: ""
	I1218 01:30:08.121995 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.122004 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:08.122010 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:08.122069 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:08.146288 1340150 cri.go:89] found id: ""
	I1218 01:30:08.146363 1340150 logs.go:282] 0 containers: []
	W1218 01:30:08.146386 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:08.146410 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:08.146443 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:08.217470 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:08.217513 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:08.233958 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:08.233986 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:08.300364 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:08.300438 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:08.300465 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:08.331762 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:08.331792 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:10.861277 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:10.871137 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:10.871204 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:10.909162 1340150 cri.go:89] found id: ""
	I1218 01:30:10.909222 1340150 logs.go:282] 0 containers: []
	W1218 01:30:10.909244 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:10.909263 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:10.909336 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:10.936480 1340150 cri.go:89] found id: ""
	I1218 01:30:10.936502 1340150 logs.go:282] 0 containers: []
	W1218 01:30:10.936526 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:10.936532 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:10.936590 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:10.968823 1340150 cri.go:89] found id: ""
	I1218 01:30:10.968844 1340150 logs.go:282] 0 containers: []
	W1218 01:30:10.968852 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:10.968858 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:10.968914 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:10.993884 1340150 cri.go:89] found id: ""
	I1218 01:30:10.993910 1340150 logs.go:282] 0 containers: []
	W1218 01:30:10.993919 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:10.993926 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:10.993985 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:11.020548 1340150 cri.go:89] found id: ""
	I1218 01:30:11.020575 1340150 logs.go:282] 0 containers: []
	W1218 01:30:11.020584 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:11.020591 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:11.020650 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:11.046887 1340150 cri.go:89] found id: ""
	I1218 01:30:11.046914 1340150 logs.go:282] 0 containers: []
	W1218 01:30:11.046922 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:11.046929 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:11.046988 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:11.080017 1340150 cri.go:89] found id: ""
	I1218 01:30:11.080040 1340150 logs.go:282] 0 containers: []
	W1218 01:30:11.080049 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:11.080064 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:11.080123 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:11.111311 1340150 cri.go:89] found id: ""
	I1218 01:30:11.111337 1340150 logs.go:282] 0 containers: []
	W1218 01:30:11.111345 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:11.111362 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:11.111376 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:11.180045 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:11.180775 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:11.196979 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:11.197006 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:11.262549 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:11.262568 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:11.262580 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:11.293911 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:11.293951 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:13.824728 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:13.835012 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:13.835079 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:13.862421 1340150 cri.go:89] found id: ""
	I1218 01:30:13.862449 1340150 logs.go:282] 0 containers: []
	W1218 01:30:13.862458 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:13.862464 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:13.862525 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:13.887496 1340150 cri.go:89] found id: ""
	I1218 01:30:13.887523 1340150 logs.go:282] 0 containers: []
	W1218 01:30:13.887531 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:13.887538 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:13.887594 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:13.914700 1340150 cri.go:89] found id: ""
	I1218 01:30:13.914728 1340150 logs.go:282] 0 containers: []
	W1218 01:30:13.914737 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:13.914742 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:13.914801 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:13.941176 1340150 cri.go:89] found id: ""
	I1218 01:30:13.941205 1340150 logs.go:282] 0 containers: []
	W1218 01:30:13.941214 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:13.941221 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:13.941280 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:13.968382 1340150 cri.go:89] found id: ""
	I1218 01:30:13.968409 1340150 logs.go:282] 0 containers: []
	W1218 01:30:13.968418 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:13.968424 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:13.968487 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:14.015008 1340150 cri.go:89] found id: ""
	I1218 01:30:14.015030 1340150 logs.go:282] 0 containers: []
	W1218 01:30:14.015039 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:14.015046 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:14.015111 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:14.042991 1340150 cri.go:89] found id: ""
	I1218 01:30:14.043014 1340150 logs.go:282] 0 containers: []
	W1218 01:30:14.043023 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:14.043029 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:14.043087 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:14.088460 1340150 cri.go:89] found id: ""
	I1218 01:30:14.088482 1340150 logs.go:282] 0 containers: []
	W1218 01:30:14.088491 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:14.088501 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:14.088514 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:14.170138 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:14.170215 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:14.193603 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:14.193629 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:14.283128 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:14.283187 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:14.283212 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:14.318941 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:14.318971 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:16.888473 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:16.919157 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:16.919274 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:16.979877 1340150 cri.go:89] found id: ""
	I1218 01:30:16.979910 1340150 logs.go:282] 0 containers: []
	W1218 01:30:16.979928 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:16.979934 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:16.980009 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:17.062072 1340150 cri.go:89] found id: ""
	I1218 01:30:17.062109 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.062124 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:17.062130 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:17.062228 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:17.111596 1340150 cri.go:89] found id: ""
	I1218 01:30:17.111619 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.111627 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:17.111633 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:17.111691 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:17.162492 1340150 cri.go:89] found id: ""
	I1218 01:30:17.162513 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.162522 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:17.162529 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:17.162584 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:17.214566 1340150 cri.go:89] found id: ""
	I1218 01:30:17.214598 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.214607 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:17.214613 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:17.214681 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:17.258699 1340150 cri.go:89] found id: ""
	I1218 01:30:17.258721 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.258730 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:17.258737 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:17.258795 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:17.310670 1340150 cri.go:89] found id: ""
	I1218 01:30:17.310692 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.310701 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:17.310707 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:17.310764 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:17.350476 1340150 cri.go:89] found id: ""
	I1218 01:30:17.350498 1340150 logs.go:282] 0 containers: []
	W1218 01:30:17.350506 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:17.350515 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:17.350526 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:17.385121 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:17.385195 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:17.421139 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:17.421163 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:17.500630 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:17.500697 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:17.519517 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:17.519543 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:17.616917 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:20.117496 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:20.127988 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:20.128057 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:20.169067 1340150 cri.go:89] found id: ""
	I1218 01:30:20.169098 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.169107 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:20.169129 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:20.169190 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:20.200421 1340150 cri.go:89] found id: ""
	I1218 01:30:20.200452 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.200461 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:20.200467 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:20.200526 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:20.230340 1340150 cri.go:89] found id: ""
	I1218 01:30:20.230366 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.230375 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:20.230382 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:20.230441 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:20.255400 1340150 cri.go:89] found id: ""
	I1218 01:30:20.255421 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.255429 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:20.255436 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:20.255494 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:20.281669 1340150 cri.go:89] found id: ""
	I1218 01:30:20.281717 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.281725 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:20.281731 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:20.281791 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:20.307669 1340150 cri.go:89] found id: ""
	I1218 01:30:20.307691 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.307700 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:20.307706 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:20.307781 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:20.337485 1340150 cri.go:89] found id: ""
	I1218 01:30:20.337511 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.337525 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:20.337532 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:20.337606 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:20.368597 1340150 cri.go:89] found id: ""
	I1218 01:30:20.368635 1340150 logs.go:282] 0 containers: []
	W1218 01:30:20.368650 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:20.368659 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:20.368671 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:20.397647 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:20.397674 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:20.463699 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:20.463734 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:20.480392 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:20.480420 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:20.549872 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:20.549890 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:20.549902 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:23.081091 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:23.093928 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:23.094001 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:23.122806 1340150 cri.go:89] found id: ""
	I1218 01:30:23.122834 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.122842 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:23.122848 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:23.122910 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:23.161736 1340150 cri.go:89] found id: ""
	I1218 01:30:23.161758 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.161773 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:23.161781 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:23.161840 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:23.191705 1340150 cri.go:89] found id: ""
	I1218 01:30:23.191726 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.191735 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:23.191741 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:23.191797 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:23.221467 1340150 cri.go:89] found id: ""
	I1218 01:30:23.221492 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.221501 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:23.221507 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:23.221569 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:23.250664 1340150 cri.go:89] found id: ""
	I1218 01:30:23.250686 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.250695 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:23.250701 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:23.250759 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:23.279873 1340150 cri.go:89] found id: ""
	I1218 01:30:23.279895 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.279903 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:23.279909 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:23.279970 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:23.305034 1340150 cri.go:89] found id: ""
	I1218 01:30:23.305056 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.305064 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:23.305070 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:23.305127 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:23.329941 1340150 cri.go:89] found id: ""
	I1218 01:30:23.329963 1340150 logs.go:282] 0 containers: []
	W1218 01:30:23.329973 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:23.329983 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:23.329993 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:23.360464 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:23.360496 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:23.390719 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:23.390747 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:23.457453 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:23.457492 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:23.475522 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:23.475567 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:23.545229 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:26.045544 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:26.056805 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:26.056884 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:26.086641 1340150 cri.go:89] found id: ""
	I1218 01:30:26.086680 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.086690 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:26.086697 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:26.086761 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:26.120636 1340150 cri.go:89] found id: ""
	I1218 01:30:26.120658 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.120666 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:26.120672 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:26.120730 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:26.159084 1340150 cri.go:89] found id: ""
	I1218 01:30:26.159109 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.159119 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:26.159125 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:26.159183 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:26.198785 1340150 cri.go:89] found id: ""
	I1218 01:30:26.198812 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.198820 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:26.198826 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:26.198924 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:26.227754 1340150 cri.go:89] found id: ""
	I1218 01:30:26.227780 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.227789 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:26.227795 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:26.227856 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:26.256648 1340150 cri.go:89] found id: ""
	I1218 01:30:26.256670 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.256678 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:26.256684 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:26.256741 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:26.280941 1340150 cri.go:89] found id: ""
	I1218 01:30:26.280971 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.280980 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:26.280986 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:26.281043 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:26.307625 1340150 cri.go:89] found id: ""
	I1218 01:30:26.307651 1340150 logs.go:282] 0 containers: []
	W1218 01:30:26.307660 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:26.307670 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:26.307682 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:26.374998 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:26.375035 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:26.391261 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:26.391287 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:26.455020 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:26.455038 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:26.455051 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:26.486676 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:26.486709 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:29.024610 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:29.034533 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:29.034599 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:29.068249 1340150 cri.go:89] found id: ""
	I1218 01:30:29.068273 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.068282 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:29.068288 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:29.068346 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:29.094028 1340150 cri.go:89] found id: ""
	I1218 01:30:29.094051 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.094060 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:29.094066 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:29.094125 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:29.122854 1340150 cri.go:89] found id: ""
	I1218 01:30:29.122880 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.122889 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:29.122932 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:29.122998 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:29.159961 1340150 cri.go:89] found id: ""
	I1218 01:30:29.159988 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.159997 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:29.160003 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:29.160059 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:29.190773 1340150 cri.go:89] found id: ""
	I1218 01:30:29.190801 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.190811 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:29.190817 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:29.190877 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:29.221357 1340150 cri.go:89] found id: ""
	I1218 01:30:29.221383 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.221391 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:29.221399 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:29.221488 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:29.247126 1340150 cri.go:89] found id: ""
	I1218 01:30:29.247190 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.247215 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:29.247234 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:29.247307 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:29.272355 1340150 cri.go:89] found id: ""
	I1218 01:30:29.272377 1340150 logs.go:282] 0 containers: []
	W1218 01:30:29.272386 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:29.272395 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:29.272409 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:29.338549 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:29.338567 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:29.338578 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:29.369941 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:29.369974 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:29.398718 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:29.398745 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:29.466583 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:29.466618 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:31.985476 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:31.996015 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:31.996084 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:32.023362 1340150 cri.go:89] found id: ""
	I1218 01:30:32.023402 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.023412 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:32.023417 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:32.023478 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:32.049437 1340150 cri.go:89] found id: ""
	I1218 01:30:32.049460 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.049469 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:32.049475 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:32.049533 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:32.080330 1340150 cri.go:89] found id: ""
	I1218 01:30:32.080356 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.080366 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:32.080373 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:32.080431 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:32.111975 1340150 cri.go:89] found id: ""
	I1218 01:30:32.112003 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.112012 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:32.112019 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:32.112094 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:32.139841 1340150 cri.go:89] found id: ""
	I1218 01:30:32.139871 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.139881 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:32.139887 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:32.139948 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:32.170472 1340150 cri.go:89] found id: ""
	I1218 01:30:32.170494 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.170503 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:32.170509 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:32.170565 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:32.205580 1340150 cri.go:89] found id: ""
	I1218 01:30:32.205601 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.205610 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:32.205616 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:32.205673 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:32.235743 1340150 cri.go:89] found id: ""
	I1218 01:30:32.235766 1340150 logs.go:282] 0 containers: []
	W1218 01:30:32.235776 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:32.235785 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:32.235796 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:32.303762 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:32.303797 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:32.320560 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:32.320593 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:32.389146 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:32.389217 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:32.389245 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:32.420335 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:32.420367 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:34.956839 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:34.971696 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:34.971782 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:35.031043 1340150 cri.go:89] found id: ""
	I1218 01:30:35.031066 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.031075 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:35.031082 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:35.031140 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:35.072029 1340150 cri.go:89] found id: ""
	I1218 01:30:35.072051 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.072060 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:35.072066 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:35.072126 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:35.106661 1340150 cri.go:89] found id: ""
	I1218 01:30:35.106682 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.106691 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:35.106697 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:35.106756 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:35.148974 1340150 cri.go:89] found id: ""
	I1218 01:30:35.148996 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.149005 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:35.149011 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:35.149070 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:35.236024 1340150 cri.go:89] found id: ""
	I1218 01:30:35.236045 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.236053 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:35.236059 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:35.236117 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:35.271896 1340150 cri.go:89] found id: ""
	I1218 01:30:35.271918 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.271926 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:35.271933 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:35.271989 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:35.305300 1340150 cri.go:89] found id: ""
	I1218 01:30:35.305324 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.305335 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:35.305342 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:35.305401 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:35.354537 1340150 cri.go:89] found id: ""
	I1218 01:30:35.354558 1340150 logs.go:282] 0 containers: []
	W1218 01:30:35.354567 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:35.354576 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:35.354588 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:35.422615 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:35.422636 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:35.422651 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:35.454120 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:35.454153 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:35.481564 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:35.481590 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:35.550973 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:35.551009 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:38.068388 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:38.081953 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:38.082031 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:38.121685 1340150 cri.go:89] found id: ""
	I1218 01:30:38.121712 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.121722 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:38.121728 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:38.121788 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:38.173981 1340150 cri.go:89] found id: ""
	I1218 01:30:38.174006 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.174015 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:38.174021 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:38.174081 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:38.221910 1340150 cri.go:89] found id: ""
	I1218 01:30:38.221936 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.221945 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:38.221951 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:38.222011 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:38.280598 1340150 cri.go:89] found id: ""
	I1218 01:30:38.280631 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.280642 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:38.280648 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:38.280714 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:38.311006 1340150 cri.go:89] found id: ""
	I1218 01:30:38.311031 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.311040 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:38.311046 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:38.311100 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:38.346066 1340150 cri.go:89] found id: ""
	I1218 01:30:38.346101 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.346110 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:38.346117 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:38.346185 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:38.399943 1340150 cri.go:89] found id: ""
	I1218 01:30:38.399964 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.399972 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:38.399978 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:38.400046 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:38.435696 1340150 cri.go:89] found id: ""
	I1218 01:30:38.435723 1340150 logs.go:282] 0 containers: []
	W1218 01:30:38.435732 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:38.435741 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:38.435757 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:38.525576 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:38.525594 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:38.525605 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:38.564212 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:38.564687 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:38.606263 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:38.606300 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:38.682111 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:38.682200 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:41.199771 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:41.209948 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:41.210014 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:41.235123 1340150 cri.go:89] found id: ""
	I1218 01:30:41.235186 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.235209 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:41.235228 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:41.235298 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:41.262274 1340150 cri.go:89] found id: ""
	I1218 01:30:41.262299 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.262307 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:41.262313 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:41.262379 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:41.287262 1340150 cri.go:89] found id: ""
	I1218 01:30:41.287287 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.287295 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:41.287302 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:41.287361 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:41.319792 1340150 cri.go:89] found id: ""
	I1218 01:30:41.319819 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.319828 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:41.319834 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:41.319906 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:41.358978 1340150 cri.go:89] found id: ""
	I1218 01:30:41.358999 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.359007 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:41.359013 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:41.359071 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:41.389221 1340150 cri.go:89] found id: ""
	I1218 01:30:41.389298 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.389320 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:41.389339 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:41.389427 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:41.426615 1340150 cri.go:89] found id: ""
	I1218 01:30:41.426696 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.426719 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:41.426739 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:41.426860 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:41.462442 1340150 cri.go:89] found id: ""
	I1218 01:30:41.462519 1340150 logs.go:282] 0 containers: []
	W1218 01:30:41.462543 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:41.462565 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:41.462604 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:41.500436 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:41.500529 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:41.538734 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:41.538812 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:41.616406 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:41.616480 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:41.636336 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:41.636361 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:41.733084 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:44.233296 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:44.243194 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:44.243257 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:44.267157 1340150 cri.go:89] found id: ""
	I1218 01:30:44.267183 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.267192 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:44.267199 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:44.267255 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:44.292585 1340150 cri.go:89] found id: ""
	I1218 01:30:44.292608 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.292617 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:44.292624 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:44.292683 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:44.318597 1340150 cri.go:89] found id: ""
	I1218 01:30:44.318620 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.318628 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:44.318635 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:44.318691 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:44.347180 1340150 cri.go:89] found id: ""
	I1218 01:30:44.347208 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.347217 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:44.347224 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:44.347281 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:44.376550 1340150 cri.go:89] found id: ""
	I1218 01:30:44.376624 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.376646 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:44.376660 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:44.376734 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:44.406813 1340150 cri.go:89] found id: ""
	I1218 01:30:44.406835 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.406844 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:44.406850 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:44.406909 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:44.431196 1340150 cri.go:89] found id: ""
	I1218 01:30:44.431223 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.431232 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:44.431239 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:44.431295 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:44.459550 1340150 cri.go:89] found id: ""
	I1218 01:30:44.459575 1340150 logs.go:282] 0 containers: []
	W1218 01:30:44.459584 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:44.459593 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:44.459616 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:44.491601 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:44.491631 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:44.558314 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:44.558352 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:44.574120 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:44.574149 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:44.639235 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:44.639267 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:44.639280 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:47.170269 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:47.182923 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:47.182992 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:47.213250 1340150 cri.go:89] found id: ""
	I1218 01:30:47.213274 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.213283 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:47.213289 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:47.213347 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:47.239477 1340150 cri.go:89] found id: ""
	I1218 01:30:47.239500 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.239509 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:47.239516 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:47.239580 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:47.264855 1340150 cri.go:89] found id: ""
	I1218 01:30:47.264886 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.264895 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:47.264901 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:47.264957 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:47.290854 1340150 cri.go:89] found id: ""
	I1218 01:30:47.290879 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.290887 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:47.290894 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:47.290954 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:47.320774 1340150 cri.go:89] found id: ""
	I1218 01:30:47.320797 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.320806 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:47.320813 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:47.320872 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:47.344934 1340150 cri.go:89] found id: ""
	I1218 01:30:47.344958 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.344967 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:47.344973 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:47.345041 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:47.371251 1340150 cri.go:89] found id: ""
	I1218 01:30:47.371273 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.371282 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:47.371288 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:47.371344 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:47.396928 1340150 cri.go:89] found id: ""
	I1218 01:30:47.396954 1340150 logs.go:282] 0 containers: []
	W1218 01:30:47.396963 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:47.396972 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:47.396984 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:47.429001 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:47.429027 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:47.501002 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:47.501042 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:47.518214 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:47.518242 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:47.587864 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:47.587882 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:47.587894 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:50.118957 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:50.129438 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:50.129511 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:50.167125 1340150 cri.go:89] found id: ""
	I1218 01:30:50.167153 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.167163 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:50.167176 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:50.167240 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:50.203339 1340150 cri.go:89] found id: ""
	I1218 01:30:50.203368 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.203377 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:50.203383 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:50.203449 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:50.236081 1340150 cri.go:89] found id: ""
	I1218 01:30:50.236109 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.236117 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:50.236123 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:50.236186 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:50.262921 1340150 cri.go:89] found id: ""
	I1218 01:30:50.262947 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.262956 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:50.262962 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:50.263027 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:50.290725 1340150 cri.go:89] found id: ""
	I1218 01:30:50.290752 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.290761 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:50.290767 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:50.290827 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:50.319594 1340150 cri.go:89] found id: ""
	I1218 01:30:50.319620 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.319629 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:50.319635 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:50.319692 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:50.343819 1340150 cri.go:89] found id: ""
	I1218 01:30:50.343857 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.343867 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:50.343873 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:50.343940 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:50.370478 1340150 cri.go:89] found id: ""
	I1218 01:30:50.370502 1340150 logs.go:282] 0 containers: []
	W1218 01:30:50.370511 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:50.370520 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:50.370531 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:50.386078 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:50.386146 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:50.451580 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:50.451645 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:50.451666 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:50.482255 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:50.482286 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:50.510390 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:50.510416 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:53.077397 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:53.087356 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:53.087435 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:53.112837 1340150 cri.go:89] found id: ""
	I1218 01:30:53.112861 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.112869 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:53.112876 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:53.112932 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:53.147904 1340150 cri.go:89] found id: ""
	I1218 01:30:53.147926 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.147935 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:53.147941 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:53.147999 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:53.181675 1340150 cri.go:89] found id: ""
	I1218 01:30:53.181698 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.181706 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:53.181712 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:53.181770 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:53.213047 1340150 cri.go:89] found id: ""
	I1218 01:30:53.213080 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.213090 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:53.213097 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:53.213153 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:53.238223 1340150 cri.go:89] found id: ""
	I1218 01:30:53.238245 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.238253 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:53.238260 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:53.238319 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:53.264413 1340150 cri.go:89] found id: ""
	I1218 01:30:53.264436 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.264445 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:53.264452 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:53.264508 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:53.288582 1340150 cri.go:89] found id: ""
	I1218 01:30:53.288607 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.288616 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:53.288623 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:53.288681 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:53.318702 1340150 cri.go:89] found id: ""
	I1218 01:30:53.318726 1340150 logs.go:282] 0 containers: []
	W1218 01:30:53.318734 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:53.318744 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:53.318755 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:53.334816 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:53.334845 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:53.398307 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:53.398373 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:53.398392 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:53.430897 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:53.430931 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:53.465467 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:53.465493 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:56.033736 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:56.044406 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:56.044480 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:56.095257 1340150 cri.go:89] found id: ""
	I1218 01:30:56.095285 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.095295 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:56.095302 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:56.095364 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:56.144189 1340150 cri.go:89] found id: ""
	I1218 01:30:56.144211 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.144262 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:56.144271 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:56.144334 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:56.200729 1340150 cri.go:89] found id: ""
	I1218 01:30:56.200755 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.200764 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:56.200770 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:56.200832 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:56.258017 1340150 cri.go:89] found id: ""
	I1218 01:30:56.258053 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.258062 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:56.258068 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:56.258135 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:56.290434 1340150 cri.go:89] found id: ""
	I1218 01:30:56.290459 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.290468 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:56.290474 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:56.290529 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:56.330532 1340150 cri.go:89] found id: ""
	I1218 01:30:56.330557 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.330572 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:56.330578 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:56.330636 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:56.357986 1340150 cri.go:89] found id: ""
	I1218 01:30:56.358011 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.358020 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:56.358026 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:56.358082 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:56.390367 1340150 cri.go:89] found id: ""
	I1218 01:30:56.390393 1340150 logs.go:282] 0 containers: []
	W1218 01:30:56.390401 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:56.390410 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:56.390421 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:56.424143 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:56.424177 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:56.454587 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:56.454618 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:56.530052 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:56.530091 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:56.546505 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:56.546535 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:56.637441 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:30:59.137688 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:30:59.150005 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:30:59.150152 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:30:59.179606 1340150 cri.go:89] found id: ""
	I1218 01:30:59.179682 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.179705 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:30:59.179725 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:30:59.179812 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:30:59.210229 1340150 cri.go:89] found id: ""
	I1218 01:30:59.210303 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.210326 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:30:59.210344 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:30:59.210426 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:30:59.238410 1340150 cri.go:89] found id: ""
	I1218 01:30:59.238486 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.238512 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:30:59.238535 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:30:59.238607 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:30:59.266579 1340150 cri.go:89] found id: ""
	I1218 01:30:59.266613 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.266623 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:30:59.266629 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:30:59.266688 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:30:59.293453 1340150 cri.go:89] found id: ""
	I1218 01:30:59.293482 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.293491 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:30:59.293505 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:30:59.293581 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:30:59.327237 1340150 cri.go:89] found id: ""
	I1218 01:30:59.327259 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.327268 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:30:59.327277 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:30:59.327333 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:30:59.354180 1340150 cri.go:89] found id: ""
	I1218 01:30:59.354208 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.354216 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:30:59.354222 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:30:59.354283 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:30:59.379772 1340150 cri.go:89] found id: ""
	I1218 01:30:59.379794 1340150 logs.go:282] 0 containers: []
	W1218 01:30:59.379803 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:30:59.379811 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:30:59.379824 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:30:59.410923 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:30:59.410960 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:30:59.439871 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:30:59.439896 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:30:59.508103 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:30:59.508144 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:30:59.524801 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:30:59.524829 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:30:59.599222 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:02.100665 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:02.110936 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:02.111003 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:02.136587 1340150 cri.go:89] found id: ""
	I1218 01:31:02.136612 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.136621 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:02.136627 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:02.136686 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:02.174776 1340150 cri.go:89] found id: ""
	I1218 01:31:02.174802 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.174812 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:02.174817 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:02.174879 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:02.214191 1340150 cri.go:89] found id: ""
	I1218 01:31:02.214212 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.214221 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:02.214227 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:02.214282 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:02.241676 1340150 cri.go:89] found id: ""
	I1218 01:31:02.241699 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.241709 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:02.241715 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:02.241775 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:02.271421 1340150 cri.go:89] found id: ""
	I1218 01:31:02.271449 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.271457 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:02.271463 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:02.271521 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:02.297228 1340150 cri.go:89] found id: ""
	I1218 01:31:02.297251 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.297259 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:02.297266 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:02.297321 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:02.326116 1340150 cri.go:89] found id: ""
	I1218 01:31:02.326137 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.326146 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:02.326152 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:02.326211 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:02.351113 1340150 cri.go:89] found id: ""
	I1218 01:31:02.351135 1340150 logs.go:282] 0 containers: []
	W1218 01:31:02.351144 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:02.351153 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:02.351164 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:02.418193 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:02.418225 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:02.434544 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:02.434570 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:02.499662 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:02.499681 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:02.499694 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:02.531071 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:02.531104 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:05.059721 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:05.070505 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:05.070576 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:05.097011 1340150 cri.go:89] found id: ""
	I1218 01:31:05.097039 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.097049 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:05.097056 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:05.097115 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:05.125145 1340150 cri.go:89] found id: ""
	I1218 01:31:05.125169 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.125178 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:05.125185 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:05.125241 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:05.173550 1340150 cri.go:89] found id: ""
	I1218 01:31:05.173577 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.173586 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:05.173592 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:05.173650 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:05.211855 1340150 cri.go:89] found id: ""
	I1218 01:31:05.211889 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.211899 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:05.211905 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:05.211978 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:05.241189 1340150 cri.go:89] found id: ""
	I1218 01:31:05.241211 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.241219 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:05.241225 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:05.241285 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:05.271328 1340150 cri.go:89] found id: ""
	I1218 01:31:05.271349 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.271358 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:05.271365 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:05.271430 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:05.297721 1340150 cri.go:89] found id: ""
	I1218 01:31:05.297743 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.297751 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:05.297757 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:05.297817 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:05.323784 1340150 cri.go:89] found id: ""
	I1218 01:31:05.323806 1340150 logs.go:282] 0 containers: []
	W1218 01:31:05.323814 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:05.323823 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:05.323834 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:05.390640 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:05.390672 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:05.407075 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:05.407101 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:05.471986 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:05.472012 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:05.472024 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:05.502890 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:05.502924 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:08.035177 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:08.049118 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:08.049194 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:08.081186 1340150 cri.go:89] found id: ""
	I1218 01:31:08.081208 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.081217 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:08.081223 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:08.081284 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:08.111060 1340150 cri.go:89] found id: ""
	I1218 01:31:08.111081 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.111089 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:08.111095 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:08.111152 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:08.138379 1340150 cri.go:89] found id: ""
	I1218 01:31:08.138404 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.138423 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:08.138447 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:08.138526 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:08.171560 1340150 cri.go:89] found id: ""
	I1218 01:31:08.171583 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.171592 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:08.171598 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:08.171663 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:08.214022 1340150 cri.go:89] found id: ""
	I1218 01:31:08.214046 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.214055 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:08.214061 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:08.214118 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:08.243863 1340150 cri.go:89] found id: ""
	I1218 01:31:08.243899 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.243908 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:08.243914 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:08.243983 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:08.270047 1340150 cri.go:89] found id: ""
	I1218 01:31:08.270071 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.270079 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:08.270086 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:08.270151 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:08.298885 1340150 cri.go:89] found id: ""
	I1218 01:31:08.298951 1340150 logs.go:282] 0 containers: []
	W1218 01:31:08.298977 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:08.298998 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:08.299042 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:08.367314 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:08.367382 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:08.367399 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:08.399398 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:08.399439 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:08.430244 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:08.430271 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:08.500352 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:08.500390 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:11.018374 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:11.028950 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:11.029022 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:11.057072 1340150 cri.go:89] found id: ""
	I1218 01:31:11.057098 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.057108 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:11.057115 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:11.057171 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:11.085717 1340150 cri.go:89] found id: ""
	I1218 01:31:11.085741 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.085750 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:11.085756 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:11.085814 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:11.111580 1340150 cri.go:89] found id: ""
	I1218 01:31:11.111607 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.111618 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:11.111624 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:11.111686 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:11.140708 1340150 cri.go:89] found id: ""
	I1218 01:31:11.140735 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.140745 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:11.140755 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:11.140814 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:11.173310 1340150 cri.go:89] found id: ""
	I1218 01:31:11.173331 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.173339 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:11.173346 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:11.173404 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:11.210043 1340150 cri.go:89] found id: ""
	I1218 01:31:11.210114 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.210136 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:11.210151 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:11.210234 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:11.236280 1340150 cri.go:89] found id: ""
	I1218 01:31:11.236343 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.236359 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:11.236368 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:11.236425 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:11.262784 1340150 cri.go:89] found id: ""
	I1218 01:31:11.262810 1340150 logs.go:282] 0 containers: []
	W1218 01:31:11.262826 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:11.262835 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:11.262847 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:11.280009 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:11.280049 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:11.351855 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:11.351877 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:11.351893 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:11.382897 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:11.382928 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:11.415869 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:11.415904 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:13.996354 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:14.009405 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:14.009487 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:14.064789 1340150 cri.go:89] found id: ""
	I1218 01:31:14.064872 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.064884 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:14.064932 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:14.065044 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:14.096743 1340150 cri.go:89] found id: ""
	I1218 01:31:14.096766 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.096775 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:14.096781 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:14.096840 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:14.122428 1340150 cri.go:89] found id: ""
	I1218 01:31:14.122451 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.122460 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:14.122467 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:14.122522 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:14.152129 1340150 cri.go:89] found id: ""
	I1218 01:31:14.152151 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.152161 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:14.152167 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:14.152256 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:14.197006 1340150 cri.go:89] found id: ""
	I1218 01:31:14.197033 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.197042 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:14.197048 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:14.197114 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:14.222986 1340150 cri.go:89] found id: ""
	I1218 01:31:14.223009 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.223018 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:14.223024 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:14.223083 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:14.248589 1340150 cri.go:89] found id: ""
	I1218 01:31:14.248610 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.248619 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:14.248625 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:14.248684 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:14.279823 1340150 cri.go:89] found id: ""
	I1218 01:31:14.279844 1340150 logs.go:282] 0 containers: []
	W1218 01:31:14.279852 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:14.279860 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:14.279872 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:14.310285 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:14.310360 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:14.378501 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:14.378534 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:14.394545 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:14.394573 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:14.463098 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:14.463163 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:14.463187 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:16.993774 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:17.003944 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:17.004008 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:17.034529 1340150 cri.go:89] found id: ""
	I1218 01:31:17.034550 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.034559 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:17.034565 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:17.034625 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:17.078350 1340150 cri.go:89] found id: ""
	I1218 01:31:17.078372 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.078380 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:17.078386 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:17.078445 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:17.106523 1340150 cri.go:89] found id: ""
	I1218 01:31:17.106544 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.106552 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:17.106558 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:17.106621 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:17.136742 1340150 cri.go:89] found id: ""
	I1218 01:31:17.136763 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.136771 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:17.136784 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:17.136838 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:17.206397 1340150 cri.go:89] found id: ""
	I1218 01:31:17.206422 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.206446 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:17.206453 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:17.206526 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:17.261492 1340150 cri.go:89] found id: ""
	I1218 01:31:17.261518 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.261527 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:17.261534 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:17.261591 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:17.288481 1340150 cri.go:89] found id: ""
	I1218 01:31:17.288508 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.288517 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:17.288524 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:17.288580 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:17.315724 1340150 cri.go:89] found id: ""
	I1218 01:31:17.315751 1340150 logs.go:282] 0 containers: []
	W1218 01:31:17.315760 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:17.315769 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:17.315781 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:17.401053 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:17.401129 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:17.419778 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:17.419807 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:17.512199 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:17.512336 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:17.512378 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:17.557788 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:17.557876 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:20.107591 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:20.118549 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:20.118620 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:20.145128 1340150 cri.go:89] found id: ""
	I1218 01:31:20.145149 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.145158 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:20.145164 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:20.145235 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:20.174336 1340150 cri.go:89] found id: ""
	I1218 01:31:20.174364 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.174372 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:20.174378 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:20.174435 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:20.209235 1340150 cri.go:89] found id: ""
	I1218 01:31:20.209258 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.209273 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:20.209280 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:20.209338 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:20.235899 1340150 cri.go:89] found id: ""
	I1218 01:31:20.235921 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.235929 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:20.235935 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:20.235995 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:20.266087 1340150 cri.go:89] found id: ""
	I1218 01:31:20.266113 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.266123 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:20.266129 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:20.266232 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:20.293302 1340150 cri.go:89] found id: ""
	I1218 01:31:20.293325 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.293335 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:20.293342 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:20.293402 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:20.317759 1340150 cri.go:89] found id: ""
	I1218 01:31:20.317781 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.317790 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:20.317796 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:20.317862 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:20.343859 1340150 cri.go:89] found id: ""
	I1218 01:31:20.343881 1340150 logs.go:282] 0 containers: []
	W1218 01:31:20.343890 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:20.343900 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:20.343911 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:20.411737 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:20.411769 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:20.427994 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:20.428021 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:20.504447 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:20.504467 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:20.504479 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:20.536295 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:20.536331 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:23.065860 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:23.075854 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:23.075918 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:23.101535 1340150 cri.go:89] found id: ""
	I1218 01:31:23.101558 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.101567 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:23.101574 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:23.101630 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:23.127418 1340150 cri.go:89] found id: ""
	I1218 01:31:23.127452 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.127470 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:23.127476 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:23.127531 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:23.159067 1340150 cri.go:89] found id: ""
	I1218 01:31:23.159094 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.159103 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:23.159108 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:23.159163 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:23.192870 1340150 cri.go:89] found id: ""
	I1218 01:31:23.192897 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.192906 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:23.192912 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:23.192969 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:23.221969 1340150 cri.go:89] found id: ""
	I1218 01:31:23.221992 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.222001 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:23.222009 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:23.222162 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:23.246614 1340150 cri.go:89] found id: ""
	I1218 01:31:23.246639 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.246647 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:23.246654 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:23.246711 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:23.270864 1340150 cri.go:89] found id: ""
	I1218 01:31:23.270889 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.270897 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:23.270904 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:23.270975 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:23.299749 1340150 cri.go:89] found id: ""
	I1218 01:31:23.299773 1340150 logs.go:282] 0 containers: []
	W1218 01:31:23.299782 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:23.299791 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:23.299803 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:23.367938 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:23.367976 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:23.385226 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:23.385257 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:23.447735 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:23.447759 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:23.447773 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:23.478023 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:23.478055 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:26.013876 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:26.024849 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:26.024927 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:26.052971 1340150 cri.go:89] found id: ""
	I1218 01:31:26.053002 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.053011 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:26.053018 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:26.053078 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:26.087512 1340150 cri.go:89] found id: ""
	I1218 01:31:26.087538 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.087548 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:26.087554 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:26.087619 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:26.112962 1340150 cri.go:89] found id: ""
	I1218 01:31:26.112987 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.112996 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:26.113002 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:26.113061 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:26.144879 1340150 cri.go:89] found id: ""
	I1218 01:31:26.144901 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.144910 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:26.144917 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:26.144974 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:26.183947 1340150 cri.go:89] found id: ""
	I1218 01:31:26.183976 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.183985 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:26.183991 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:26.184046 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:26.213593 1340150 cri.go:89] found id: ""
	I1218 01:31:26.213616 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.213625 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:26.213632 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:26.213697 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:26.243111 1340150 cri.go:89] found id: ""
	I1218 01:31:26.243135 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.243144 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:26.243150 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:26.243206 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:26.268432 1340150 cri.go:89] found id: ""
	I1218 01:31:26.268461 1340150 logs.go:282] 0 containers: []
	W1218 01:31:26.268470 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:26.268479 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:26.268493 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:26.336352 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:26.336388 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:26.352016 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:26.352043 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:26.423027 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:26.423050 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:26.423063 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:26.453646 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:26.453680 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:28.988344 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:28.999651 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:28.999715 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:29.037721 1340150 cri.go:89] found id: ""
	I1218 01:31:29.037742 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.037751 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:29.037757 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:29.037814 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:29.079163 1340150 cri.go:89] found id: ""
	I1218 01:31:29.079183 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.079192 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:29.079198 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:29.079267 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:29.106718 1340150 cri.go:89] found id: ""
	I1218 01:31:29.106738 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.106747 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:29.106753 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:29.106820 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:29.137598 1340150 cri.go:89] found id: ""
	I1218 01:31:29.137620 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.137628 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:29.137634 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:29.137710 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:29.200029 1340150 cri.go:89] found id: ""
	I1218 01:31:29.200051 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.200059 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:29.200065 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:29.200122 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:29.258184 1340150 cri.go:89] found id: ""
	I1218 01:31:29.258206 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.258214 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:29.258220 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:29.258276 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:29.294640 1340150 cri.go:89] found id: ""
	I1218 01:31:29.294661 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.294670 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:29.294675 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:29.294732 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:29.333316 1340150 cri.go:89] found id: ""
	I1218 01:31:29.333341 1340150 logs.go:282] 0 containers: []
	W1218 01:31:29.333350 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:29.333362 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:29.333375 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:29.370446 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:29.370472 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:29.438796 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:29.438830 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:29.454404 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:29.454434 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:29.535966 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:29.535989 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:29.536005 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:32.067038 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:32.078392 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:32.078459 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:32.110789 1340150 cri.go:89] found id: ""
	I1218 01:31:32.110811 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.110819 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:32.110825 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:32.110883 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:32.153088 1340150 cri.go:89] found id: ""
	I1218 01:31:32.153113 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.153122 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:32.153128 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:32.153224 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:32.228799 1340150 cri.go:89] found id: ""
	I1218 01:31:32.228823 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.228831 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:32.228837 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:32.228892 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:32.274637 1340150 cri.go:89] found id: ""
	I1218 01:31:32.274660 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.274670 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:32.274676 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:32.274731 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:32.303240 1340150 cri.go:89] found id: ""
	I1218 01:31:32.303266 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.303275 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:32.303281 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:32.303337 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:32.335840 1340150 cri.go:89] found id: ""
	I1218 01:31:32.335865 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.335874 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:32.335881 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:32.335936 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:32.369569 1340150 cri.go:89] found id: ""
	I1218 01:31:32.369596 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.369605 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:32.369611 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:32.369667 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:32.409901 1340150 cri.go:89] found id: ""
	I1218 01:31:32.409926 1340150 logs.go:282] 0 containers: []
	W1218 01:31:32.409936 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:32.409944 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:32.409956 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:32.429535 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:32.429566 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:32.518913 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:32.518935 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:32.518947 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:32.556953 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:32.556988 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:32.594347 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:32.594374 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:35.172261 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:35.182870 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:35.182941 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:35.209218 1340150 cri.go:89] found id: ""
	I1218 01:31:35.209241 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.209250 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:35.209256 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:35.209311 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:35.235481 1340150 cri.go:89] found id: ""
	I1218 01:31:35.235504 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.235513 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:35.235519 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:35.235575 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:35.262793 1340150 cri.go:89] found id: ""
	I1218 01:31:35.262816 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.262827 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:35.262833 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:35.262888 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:35.290157 1340150 cri.go:89] found id: ""
	I1218 01:31:35.290181 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.290190 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:35.290196 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:35.290256 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:35.330608 1340150 cri.go:89] found id: ""
	I1218 01:31:35.330633 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.330642 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:35.330648 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:35.330710 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:35.363470 1340150 cri.go:89] found id: ""
	I1218 01:31:35.363492 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.363500 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:35.363507 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:35.363561 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:35.395190 1340150 cri.go:89] found id: ""
	I1218 01:31:35.395214 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.395222 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:35.395228 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:35.395287 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:35.429664 1340150 cri.go:89] found id: ""
	I1218 01:31:35.429691 1340150 logs.go:282] 0 containers: []
	W1218 01:31:35.429700 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:35.429710 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:35.429722 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:35.447845 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:35.447880 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:35.532320 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:35.532344 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:35.532355 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:35.563648 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:35.563683 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:35.601681 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:35.601709 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
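	Each poll cycle above issues one `sudo crictl ps -a --quiet --name=<component>` query per control-plane component before re-gathering the kubelet, dmesg, and CRI-O logs. A minimal sketch of that per-component query loop (the `build_cri_query` helper name is illustrative, not part of minikube; component names are taken from the log lines above):

```shell
#!/bin/sh
# Build the same CRI query string minikube runs for each component.
build_cri_query() {
  printf 'sudo crictl ps -a --quiet --name=%s' "$1"
}

# The eight components probed in every cycle of the log above.
for c in kube-apiserver etcd coredns kube-scheduler kube-proxy \
         kube-controller-manager kindnet storage-provisioner; do
  build_cri_query "$c"
  echo
done
```

An empty result from each query is what produces the `found id: ""` / `0 containers` pairs in the log, which in turn triggers the fallback log gathering.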
	I1218 01:31:38.178214 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:38.192665 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:38.192732 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:38.223303 1340150 cri.go:89] found id: ""
	I1218 01:31:38.223326 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.223335 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:38.223342 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:38.223400 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:38.250740 1340150 cri.go:89] found id: ""
	I1218 01:31:38.250767 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.250775 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:38.250782 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:38.250853 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:38.277476 1340150 cri.go:89] found id: ""
	I1218 01:31:38.277501 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.277510 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:38.277516 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:38.277574 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:38.303507 1340150 cri.go:89] found id: ""
	I1218 01:31:38.303531 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.303540 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:38.303547 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:38.303632 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:38.330161 1340150 cri.go:89] found id: ""
	I1218 01:31:38.330181 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.330190 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:38.330196 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:38.330257 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:38.354899 1340150 cri.go:89] found id: ""
	I1218 01:31:38.354921 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.354930 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:38.354936 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:38.354993 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:38.385814 1340150 cri.go:89] found id: ""
	I1218 01:31:38.385837 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.385847 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:38.385853 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:38.385916 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:38.412192 1340150 cri.go:89] found id: ""
	I1218 01:31:38.412238 1340150 logs.go:282] 0 containers: []
	W1218 01:31:38.412250 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:38.412259 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:38.412271 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:38.480530 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:38.480571 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:38.499318 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:38.499350 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:38.570272 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:38.570344 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:38.570373 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:38.601754 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:38.601811 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:41.137449 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:41.151047 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:41.151123 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:41.188487 1340150 cri.go:89] found id: ""
	I1218 01:31:41.188507 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.188516 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:41.188522 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:41.188578 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:41.222587 1340150 cri.go:89] found id: ""
	I1218 01:31:41.222608 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.222617 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:41.222623 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:41.222677 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:41.248606 1340150 cri.go:89] found id: ""
	I1218 01:31:41.248632 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.248641 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:41.248647 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:41.248708 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:41.278360 1340150 cri.go:89] found id: ""
	I1218 01:31:41.278383 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.278393 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:41.278399 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:41.278456 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:41.308660 1340150 cri.go:89] found id: ""
	I1218 01:31:41.308684 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.308693 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:41.308700 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:41.308756 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:41.338516 1340150 cri.go:89] found id: ""
	I1218 01:31:41.338540 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.338550 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:41.338556 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:41.338614 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:41.364039 1340150 cri.go:89] found id: ""
	I1218 01:31:41.364059 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.364068 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:41.364074 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:41.364131 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:41.389803 1340150 cri.go:89] found id: ""
	I1218 01:31:41.389827 1340150 logs.go:282] 0 containers: []
	W1218 01:31:41.389836 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:41.389846 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:41.389858 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:41.421119 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:41.421152 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:41.452237 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:41.452263 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:41.527669 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:41.527714 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:41.544249 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:41.544280 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:41.612389 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:44.112612 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:44.124512 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:44.124580 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:44.160545 1340150 cri.go:89] found id: ""
	I1218 01:31:44.160570 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.160582 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:44.160588 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:44.160654 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:44.191985 1340150 cri.go:89] found id: ""
	I1218 01:31:44.192009 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.192018 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:44.192024 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:44.192090 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:44.218068 1340150 cri.go:89] found id: ""
	I1218 01:31:44.218088 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.218097 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:44.218102 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:44.218158 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:44.246202 1340150 cri.go:89] found id: ""
	I1218 01:31:44.246226 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.246235 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:44.246241 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:44.246300 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:44.271836 1340150 cri.go:89] found id: ""
	I1218 01:31:44.271861 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.271870 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:44.271877 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:44.271933 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:44.296317 1340150 cri.go:89] found id: ""
	I1218 01:31:44.296347 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.296355 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:44.296362 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:44.296418 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:44.321345 1340150 cri.go:89] found id: ""
	I1218 01:31:44.321367 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.321375 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:44.321381 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:44.321437 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:44.353085 1340150 cri.go:89] found id: ""
	I1218 01:31:44.353105 1340150 logs.go:282] 0 containers: []
	W1218 01:31:44.353114 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:44.353124 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:44.353135 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:44.420572 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:44.420609 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:44.437247 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:44.437273 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:44.511058 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:44.511078 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:44.511091 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:44.542948 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:44.542984 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:47.076364 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:47.086466 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:31:47.086535 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:31:47.112116 1340150 cri.go:89] found id: ""
	I1218 01:31:47.112140 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.112149 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:31:47.112156 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:31:47.112252 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:31:47.140482 1340150 cri.go:89] found id: ""
	I1218 01:31:47.140507 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.140516 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:31:47.140529 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:31:47.140586 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:31:47.174753 1340150 cri.go:89] found id: ""
	I1218 01:31:47.174778 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.174787 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:31:47.174793 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:31:47.174858 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:31:47.211054 1340150 cri.go:89] found id: ""
	I1218 01:31:47.211078 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.211086 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:31:47.211092 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:31:47.211149 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:31:47.236908 1340150 cri.go:89] found id: ""
	I1218 01:31:47.236933 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.236942 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:31:47.236948 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:31:47.237033 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:31:47.262894 1340150 cri.go:89] found id: ""
	I1218 01:31:47.262918 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.262927 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:31:47.262933 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:31:47.262991 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:31:47.287588 1340150 cri.go:89] found id: ""
	I1218 01:31:47.287612 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.287622 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:31:47.287628 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:31:47.287685 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:31:47.312496 1340150 cri.go:89] found id: ""
	I1218 01:31:47.312518 1340150 logs.go:282] 0 containers: []
	W1218 01:31:47.312526 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:31:47.312535 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:31:47.312566 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:31:47.379667 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:31:47.379705 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:31:47.396478 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:31:47.396507 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:31:47.472828 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1218 01:31:47.472888 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:31:47.472907 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:31:47.504268 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:31:47.504303 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:31:50.032920 1340150 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:31:50.043395 1340150 kubeadm.go:602] duration metric: took 4m5.433364787s to restartPrimaryControlPlane
	W1218 01:31:50.043457 1340150 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1218 01:31:50.043524 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 01:31:50.466030 1340150 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:31:50.478924 1340150 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1218 01:31:50.487103 1340150 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 01:31:50.487166 1340150 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 01:31:50.495905 1340150 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 01:31:50.495925 1340150 kubeadm.go:158] found existing configuration files:
	
	I1218 01:31:50.496007 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1218 01:31:50.504329 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 01:31:50.504407 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 01:31:50.512086 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1218 01:31:50.520750 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 01:31:50.520841 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 01:31:50.528550 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1218 01:31:50.536594 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 01:31:50.536681 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 01:31:50.544421 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1218 01:31:50.552600 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 01:31:50.552675 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 01:31:50.560159 1340150 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 01:31:50.599490 1340150 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 01:31:50.599564 1340150 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 01:31:50.671924 1340150 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 01:31:50.672001 1340150 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 01:31:50.672042 1340150 kubeadm.go:319] OS: Linux
	I1218 01:31:50.672092 1340150 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 01:31:50.672144 1340150 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 01:31:50.672196 1340150 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 01:31:50.672266 1340150 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 01:31:50.672320 1340150 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 01:31:50.672370 1340150 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 01:31:50.672419 1340150 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 01:31:50.672471 1340150 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 01:31:50.672524 1340150 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 01:31:50.745044 1340150 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 01:31:50.745174 1340150 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 01:31:50.745283 1340150 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 01:31:50.760624 1340150 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 01:31:50.763886 1340150 out.go:252]   - Generating certificates and keys ...
	I1218 01:31:50.764044 1340150 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 01:31:50.764157 1340150 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 01:31:50.764321 1340150 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 01:31:50.764441 1340150 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 01:31:50.764549 1340150 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 01:31:50.764656 1340150 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 01:31:50.764763 1340150 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 01:31:50.764870 1340150 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 01:31:50.764955 1340150 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 01:31:50.765034 1340150 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 01:31:50.765075 1340150 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 01:31:50.765135 1340150 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 01:31:51.045498 1340150 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 01:31:51.599133 1340150 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 01:31:51.687892 1340150 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 01:31:52.246741 1340150 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 01:31:52.481346 1340150 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 01:31:52.482152 1340150 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 01:31:52.484940 1340150 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 01:31:52.488207 1340150 out.go:252]   - Booting up control plane ...
	I1218 01:31:52.488344 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 01:31:52.490101 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 01:31:52.491735 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 01:31:52.515848 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 01:31:52.515979 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 01:31:52.530384 1340150 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 01:31:52.531192 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 01:31:52.531539 1340150 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 01:31:52.734380 1340150 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 01:31:52.734514 1340150 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 01:35:52.734955 1340150 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001126685s
	I1218 01:35:52.737534 1340150 kubeadm.go:319] 
	I1218 01:35:52.737594 1340150 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 01:35:52.737629 1340150 kubeadm.go:319] 	- The kubelet is not running
	I1218 01:35:52.737734 1340150 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 01:35:52.737739 1340150 kubeadm.go:319] 
	I1218 01:35:52.737843 1340150 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 01:35:52.737877 1340150 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 01:35:52.737908 1340150 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 01:35:52.737913 1340150 kubeadm.go:319] 
	I1218 01:35:52.738834 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 01:35:52.739286 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 01:35:52.739403 1340150 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 01:35:52.739656 1340150 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 01:35:52.739663 1340150 kubeadm.go:319] 
	I1218 01:35:52.739736 1340150 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1218 01:35:52.739916 1340150 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001126685s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1218 01:35:52.740003 1340150 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /var/run/crio/crio.sock --force"
	I1218 01:35:53.168268 1340150 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:35:53.181516 1340150 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1218 01:35:53.181583 1340150 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1218 01:35:53.191611 1340150 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1218 01:35:53.191631 1340150 kubeadm.go:158] found existing configuration files:
	
	I1218 01:35:53.191679 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1218 01:35:53.200543 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1218 01:35:53.200605 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1218 01:35:53.209044 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1218 01:35:53.216804 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1218 01:35:53.216873 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1218 01:35:53.225576 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1218 01:35:53.234009 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1218 01:35:53.234072 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1218 01:35:53.241198 1340150 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1218 01:35:53.248910 1340150 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1218 01:35:53.248976 1340150 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1218 01:35:53.256474 1340150 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1218 01:35:53.297685 1340150 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1218 01:35:53.297780 1340150 kubeadm.go:319] [preflight] Running pre-flight checks
	I1218 01:35:53.367173 1340150 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1218 01:35:53.367250 1340150 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1218 01:35:53.367289 1340150 kubeadm.go:319] OS: Linux
	I1218 01:35:53.367334 1340150 kubeadm.go:319] CGROUPS_CPU: enabled
	I1218 01:35:53.367382 1340150 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1218 01:35:53.367429 1340150 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1218 01:35:53.367477 1340150 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1218 01:35:53.367525 1340150 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1218 01:35:53.367577 1340150 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1218 01:35:53.367623 1340150 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1218 01:35:53.367671 1340150 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1218 01:35:53.367717 1340150 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1218 01:35:53.432687 1340150 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1218 01:35:53.432856 1340150 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1218 01:35:53.433000 1340150 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1218 01:35:53.440339 1340150 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1218 01:35:53.445935 1340150 out.go:252]   - Generating certificates and keys ...
	I1218 01:35:53.446086 1340150 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1218 01:35:53.446192 1340150 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1218 01:35:53.446313 1340150 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1218 01:35:53.446418 1340150 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1218 01:35:53.446523 1340150 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1218 01:35:53.446597 1340150 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1218 01:35:53.446707 1340150 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1218 01:35:53.446784 1340150 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1218 01:35:53.446864 1340150 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1218 01:35:53.446940 1340150 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1218 01:35:53.446981 1340150 kubeadm.go:319] [certs] Using the existing "sa" key
	I1218 01:35:53.447041 1340150 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1218 01:35:53.578183 1340150 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1218 01:35:54.401565 1340150 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1218 01:35:54.489201 1340150 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1218 01:35:54.660854 1340150 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1218 01:35:55.227517 1340150 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1218 01:35:55.228304 1340150 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1218 01:35:55.231562 1340150 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1218 01:35:55.234773 1340150 out.go:252]   - Booting up control plane ...
	I1218 01:35:55.234922 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1218 01:35:55.235041 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1218 01:35:55.235865 1340150 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1218 01:35:55.251137 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1218 01:35:55.251297 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1218 01:35:55.258554 1340150 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1218 01:35:55.259085 1340150 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1218 01:35:55.259153 1340150 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1218 01:35:55.386392 1340150 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1218 01:35:55.386517 1340150 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1218 01:39:55.386677 1340150 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000318573s
	I1218 01:39:55.386711 1340150 kubeadm.go:319] 
	I1218 01:39:55.386801 1340150 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 01:39:55.386856 1340150 kubeadm.go:319] 	- The kubelet is not running
	I1218 01:39:55.386982 1340150 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 01:39:55.386992 1340150 kubeadm.go:319] 
	I1218 01:39:55.387097 1340150 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 01:39:55.387129 1340150 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 01:39:55.387160 1340150 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 01:39:55.387164 1340150 kubeadm.go:319] 
	I1218 01:39:55.391847 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 01:39:55.392362 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 01:39:55.392500 1340150 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 01:39:55.392761 1340150 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 01:39:55.392774 1340150 kubeadm.go:319] 
	I1218 01:39:55.392848 1340150 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 01:39:55.392914 1340150 kubeadm.go:403] duration metric: took 12m10.818401776s to StartCluster
	I1218 01:39:55.392964 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:39:55.393031 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:39:55.457289 1340150 cri.go:89] found id: ""
	I1218 01:39:55.457323 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.457336 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:39:55.457343 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:39:55.457411 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:39:55.505387 1340150 cri.go:89] found id: ""
	I1218 01:39:55.505410 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.505419 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:39:55.505430 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:39:55.505538 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:39:55.546732 1340150 cri.go:89] found id: ""
	I1218 01:39:55.546756 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.546766 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:39:55.546780 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:39:55.546838 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:39:55.578795 1340150 cri.go:89] found id: ""
	I1218 01:39:55.578821 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.578830 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:39:55.578837 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:39:55.578894 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:39:55.615657 1340150 cri.go:89] found id: ""
	I1218 01:39:55.615683 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.615693 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:39:55.615699 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:39:55.615765 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:39:55.663057 1340150 cri.go:89] found id: ""
	I1218 01:39:55.663098 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.663107 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:39:55.663114 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:39:55.663175 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:39:55.697387 1340150 cri.go:89] found id: ""
	I1218 01:39:55.697414 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.697424 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:39:55.697430 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:39:55.697493 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:39:55.728614 1340150 cri.go:89] found id: ""
	I1218 01:39:55.728661 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.728675 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:39:55.728695 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:39:55.728746 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:39:55.764321 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:39:55.764399 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:39:55.803982 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:39:55.804056 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:39:55.897653 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:39:55.897732 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:39:55.915692 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:39:55.915717 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:39:56.022499 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	W1218 01:39:56.022578 1340150 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000318573s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 01:39:56.022632 1340150 out.go:285] * 
	* 
	W1218 01:39:56.022747 1340150 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000318573s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 01:39:56.022812 1340150 out.go:285] * 
	W1218 01:39:56.025404 1340150 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 01:39:56.031643 1340150 out.go:203] 
	W1218 01:39:56.034657 1340150 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000318573s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1218 01:39:56.034816 1340150 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 01:39:56.034877 1340150 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 01:39:56.038133 1340150 out.go:203] 

** /stderr **
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-823559 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-823559 version --output=json: exit status 1 (144.244855ms)

-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

** /stderr **
version_upgrade_test.go:250: error running kubectl: exit status 1
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-12-18 01:39:56.960647708 +0000 UTC m=+5277.676811846
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect kubernetes-upgrade-823559
helpers_test.go:244: (dbg) docker inspect kubernetes-upgrade-823559:

-- stdout --
	[
	    {
	        "Id": "48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7",
	        "Created": "2025-12-18T01:26:51.90541821Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1340554,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T01:27:31.000911021Z",
	            "FinishedAt": "2025-12-18T01:27:29.673740315Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7/hostname",
	        "HostsPath": "/var/lib/docker/containers/48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7/hosts",
	        "LogPath": "/var/lib/docker/containers/48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7/48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7-json.log",
	        "Name": "/kubernetes-upgrade-823559",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "kubernetes-upgrade-823559:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-823559",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "48f7f0d0914d4d50e0748b625a86f8d0e15d3b0027a830519ca6c22abd6422a7",
	                "LowerDir": "/var/lib/docker/overlay2/3cea5ed3a1d032262b9c0d40e25ddae8cedca5e691a44d20a33f646831bfcd10-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cea5ed3a1d032262b9c0d40e25ddae8cedca5e691a44d20a33f646831bfcd10/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cea5ed3a1d032262b9c0d40e25ddae8cedca5e691a44d20a33f646831bfcd10/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cea5ed3a1d032262b9c0d40e25ddae8cedca5e691a44d20a33f646831bfcd10/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-823559",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-823559/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-823559",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-823559",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-823559",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "bd52c21eece13b765e0eb7d8ddeaeb755b241cdb56ad827deacae94ebe033b1a",
	            "SandboxKey": "/var/run/docker/netns/bd52c21eece1",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34155"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34156"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34159"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34157"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34158"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-823559": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "6e:a4:e9:8f:c0:9f",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "2ee425327f59d15a369d961802f36099af157f95e6787feadb929d0da9d28930",
	                    "EndpointID": "8ef19acb15e8b6ee40638903c673151c8871d2be97644b5f5f0e987c81dbecc8",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-823559",
	                        "48f7f0d0914d"
	                    ]
	                }
	            }
	        }
	    }
]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-823559 -n kubernetes-upgrade-823559
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-823559 -n kubernetes-upgrade-823559: exit status 2 (431.430266ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-823559 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p kubernetes-upgrade-823559 logs -n 25: (1.05551543s)
helpers_test.go:261: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                    ARGS                                                    │         PROFILE          │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ -p cilium-289935 sudo systemctl status kubelet --all --full --no-pager                                     │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl cat kubelet --no-pager                                                     │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo journalctl -xeu kubelet --all --full --no-pager                                      │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /etc/kubernetes/kubelet.conf                                                     │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /var/lib/kubelet/config.yaml                                                     │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl status docker --all --full --no-pager                                      │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl cat docker --no-pager                                                      │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /etc/docker/daemon.json                                                          │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo docker system info                                                                   │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl status cri-docker --all --full --no-pager                                  │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl cat cri-docker --no-pager                                                  │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /etc/systemd/system/cri-docker.service.d/10-cni.conf                             │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /usr/lib/systemd/system/cri-docker.service                                       │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cri-dockerd --version                                                                │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl status containerd --all --full --no-pager                                  │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl cat containerd --no-pager                                                  │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /lib/systemd/system/containerd.service                                           │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo cat /etc/containerd/config.toml                                                      │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo containerd config dump                                                               │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl status crio --all --full --no-pager                                        │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo systemctl cat crio --no-pager                                                        │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo find /etc/crio -type f -exec sh -c 'echo {}; cat {}' \;                              │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ ssh     │ -p cilium-289935 sudo crio config                                                                          │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	│ delete  │ -p cilium-289935                                                                                           │ cilium-289935            │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │ 18 Dec 25 01:39 UTC │
	│ start   │ -p force-systemd-env-066220 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio │ force-systemd-env-066220 │ jenkins │ v1.37.0 │ 18 Dec 25 01:39 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 01:39:44
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 01:39:44.421414 1379211 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:39:44.421664 1379211 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:39:44.421694 1379211 out.go:374] Setting ErrFile to fd 2...
	I1218 01:39:44.421712 1379211 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:39:44.422002 1379211 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:39:44.422446 1379211 out.go:368] Setting JSON to false
	I1218 01:39:44.423344 1379211 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":30133,"bootTime":1765991852,"procs":175,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 01:39:44.423429 1379211 start.go:143] virtualization:  
	I1218 01:39:44.426942 1379211 out.go:179] * [force-systemd-env-066220] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 01:39:44.429995 1379211 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 01:39:44.430073 1379211 notify.go:221] Checking for updates...
	I1218 01:39:44.433588 1379211 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 01:39:44.436388 1379211 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:39:44.439246 1379211 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 01:39:44.442150 1379211 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 01:39:44.444951 1379211 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=true
	I1218 01:39:44.448383 1379211 config.go:182] Loaded profile config "kubernetes-upgrade-823559": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 01:39:44.448482 1379211 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 01:39:44.474580 1379211 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 01:39:44.474701 1379211 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:39:44.531976 1379211 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 01:39:44.522564456 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:39:44.532083 1379211 docker.go:319] overlay module found
	I1218 01:39:44.537104 1379211 out.go:179] * Using the docker driver based on user configuration
	I1218 01:39:44.540050 1379211 start.go:309] selected driver: docker
	I1218 01:39:44.540069 1379211 start.go:927] validating driver "docker" against <nil>
	I1218 01:39:44.540082 1379211 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 01:39:44.540854 1379211 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:39:44.594314 1379211 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 01:39:44.585063015 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:39:44.594462 1379211 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 01:39:44.594684 1379211 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1218 01:39:44.597577 1379211 out.go:179] * Using Docker driver with root privileges
	I1218 01:39:44.600567 1379211 cni.go:84] Creating CNI manager for ""
	I1218 01:39:44.600626 1379211 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:39:44.600638 1379211 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 01:39:44.600727 1379211 start.go:353] cluster config:
	{Name:force-systemd-env-066220 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-066220 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.
local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:39:44.603761 1379211 out.go:179] * Starting "force-systemd-env-066220" primary control-plane node in "force-systemd-env-066220" cluster
	I1218 01:39:44.606608 1379211 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 01:39:44.609566 1379211 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 01:39:44.612581 1379211 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:39:44.612630 1379211 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 01:39:44.612643 1379211 cache.go:65] Caching tarball of preloaded images
	I1218 01:39:44.612654 1379211 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 01:39:44.612725 1379211 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 01:39:44.612735 1379211 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 01:39:44.612841 1379211 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/force-systemd-env-066220/config.json ...
	I1218 01:39:44.612857 1379211 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/force-systemd-env-066220/config.json: {Name:mkeac25b7355fa28601dfca574011c47110d708a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:39:44.631800 1379211 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 01:39:44.631826 1379211 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 01:39:44.631847 1379211 cache.go:243] Successfully downloaded all kic artifacts
	I1218 01:39:44.631878 1379211 start.go:360] acquireMachinesLock for force-systemd-env-066220: {Name:mk110ccb4f13047e4d3e691d19eadb821d64024c Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 01:39:44.631990 1379211 start.go:364] duration metric: took 92.264µs to acquireMachinesLock for "force-systemd-env-066220"
	I1218 01:39:44.632020 1379211 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-066220 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-066220 Namespace:default APIServerHA
VIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SS
HAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 01:39:44.632087 1379211 start.go:125] createHost starting for "" (driver="docker")
	I1218 01:39:44.635470 1379211 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1218 01:39:44.635704 1379211 start.go:159] libmachine.API.Create for "force-systemd-env-066220" (driver="docker")
	I1218 01:39:44.635742 1379211 client.go:173] LocalClient.Create starting
	I1218 01:39:44.635815 1379211 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem
	I1218 01:39:44.635876 1379211 main.go:143] libmachine: Decoding PEM data...
	I1218 01:39:44.635903 1379211 main.go:143] libmachine: Parsing certificate...
	I1218 01:39:44.635949 1379211 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem
	I1218 01:39:44.635981 1379211 main.go:143] libmachine: Decoding PEM data...
	I1218 01:39:44.635996 1379211 main.go:143] libmachine: Parsing certificate...
	I1218 01:39:44.636375 1379211 cli_runner.go:164] Run: docker network inspect force-systemd-env-066220 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1218 01:39:44.651964 1379211 cli_runner.go:211] docker network inspect force-systemd-env-066220 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1218 01:39:44.652044 1379211 network_create.go:284] running [docker network inspect force-systemd-env-066220] to gather additional debugging logs...
	I1218 01:39:44.652063 1379211 cli_runner.go:164] Run: docker network inspect force-systemd-env-066220
	W1218 01:39:44.667219 1379211 cli_runner.go:211] docker network inspect force-systemd-env-066220 returned with exit code 1
	I1218 01:39:44.667245 1379211 network_create.go:287] error running [docker network inspect force-systemd-env-066220]: docker network inspect force-systemd-env-066220: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network force-systemd-env-066220 not found
	I1218 01:39:44.667258 1379211 network_create.go:289] output of [docker network inspect force-systemd-env-066220]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network force-systemd-env-066220 not found
	
	** /stderr **
	I1218 01:39:44.667367 1379211 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 01:39:44.683299 1379211 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-6457214f0a50 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:0a:67:20:e8:65:78} reservation:<nil>}
	I1218 01:39:44.683581 1379211 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-594399faec05 IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:ce:09:30:83:16:af} reservation:<nil>}
	I1218 01:39:44.683834 1379211 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-2e8c23a6124f IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:d2:6b:4a:84:b3:77} reservation:<nil>}
	I1218 01:39:44.684162 1379211 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-2ee425327f59 IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:86:7b:a7:37:06:45} reservation:<nil>}
	I1218 01:39:44.684718 1379211 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40019c1150}
	I1218 01:39:44.684744 1379211 network_create.go:124] attempt to create docker network force-systemd-env-066220 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1218 01:39:44.684798 1379211 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=force-systemd-env-066220 force-systemd-env-066220
	I1218 01:39:44.736509 1379211 network_create.go:108] docker network force-systemd-env-066220 192.168.85.0/24 created
	I1218 01:39:44.736543 1379211 kic.go:121] calculated static IP "192.168.85.2" for the "force-systemd-env-066220" container
	I1218 01:39:44.736631 1379211 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1218 01:39:44.752557 1379211 cli_runner.go:164] Run: docker volume create force-systemd-env-066220 --label name.minikube.sigs.k8s.io=force-systemd-env-066220 --label created_by.minikube.sigs.k8s.io=true
	I1218 01:39:44.771068 1379211 oci.go:103] Successfully created a docker volume force-systemd-env-066220
	I1218 01:39:44.771177 1379211 cli_runner.go:164] Run: docker run --rm --name force-systemd-env-066220-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=force-systemd-env-066220 --entrypoint /usr/bin/test -v force-systemd-env-066220:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1218 01:39:45.404944 1379211 oci.go:107] Successfully prepared a docker volume force-systemd-env-066220
	I1218 01:39:45.405011 1379211 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:39:45.405021 1379211 kic.go:194] Starting extracting preloaded images to volume ...
	I1218 01:39:45.405095 1379211 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v force-systemd-env-066220:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1218 01:39:49.292207 1379211 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4:/preloaded.tar:ro -v force-systemd-env-066220:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.887056495s)
	I1218 01:39:49.292256 1379211 kic.go:203] duration metric: took 3.887231612s to extract preloaded images to volume ...
	W1218 01:39:49.292387 1379211 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1218 01:39:49.292488 1379211 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1218 01:39:49.342961 1379211 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname force-systemd-env-066220 --name force-systemd-env-066220 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=force-systemd-env-066220 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=force-systemd-env-066220 --network force-systemd-env-066220 --ip 192.168.85.2 --volume force-systemd-env-066220:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1218 01:39:49.635557 1379211 cli_runner.go:164] Run: docker container inspect force-systemd-env-066220 --format={{.State.Running}}
	I1218 01:39:49.657134 1379211 cli_runner.go:164] Run: docker container inspect force-systemd-env-066220 --format={{.State.Status}}
	I1218 01:39:49.683088 1379211 cli_runner.go:164] Run: docker exec force-systemd-env-066220 stat /var/lib/dpkg/alternatives/iptables
	I1218 01:39:49.759385 1379211 oci.go:144] the created container "force-systemd-env-066220" has a running status.
	I1218 01:39:49.759416 1379211 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa...
	I1218 01:39:50.156390 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa.pub -> /home/docker/.ssh/authorized_keys
	I1218 01:39:50.156479 1379211 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1218 01:39:50.186156 1379211 cli_runner.go:164] Run: docker container inspect force-systemd-env-066220 --format={{.State.Status}}
	I1218 01:39:50.210776 1379211 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1218 01:39:50.210797 1379211 kic_runner.go:114] Args: [docker exec --privileged force-systemd-env-066220 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1218 01:39:50.262951 1379211 cli_runner.go:164] Run: docker container inspect force-systemd-env-066220 --format={{.State.Status}}
	I1218 01:39:50.282418 1379211 machine.go:94] provisionDockerMachine start ...
	I1218 01:39:50.282514 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:50.300115 1379211 main.go:143] libmachine: Using SSH client type: native
	I1218 01:39:50.300520 1379211 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34180 <nil> <nil>}
	I1218 01:39:50.300537 1379211 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 01:39:50.301153 1379211 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:47328->127.0.0.1:34180: read: connection reset by peer
	I1218 01:39:53.459769 1379211 main.go:143] libmachine: SSH cmd err, output: <nil>: force-systemd-env-066220
	
	I1218 01:39:53.459814 1379211 ubuntu.go:182] provisioning hostname "force-systemd-env-066220"
	I1218 01:39:53.459878 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:53.478868 1379211 main.go:143] libmachine: Using SSH client type: native
	I1218 01:39:53.479200 1379211 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34180 <nil> <nil>}
	I1218 01:39:53.479217 1379211 main.go:143] libmachine: About to run SSH command:
	sudo hostname force-systemd-env-066220 && echo "force-systemd-env-066220" | sudo tee /etc/hostname
	I1218 01:39:53.643030 1379211 main.go:143] libmachine: SSH cmd err, output: <nil>: force-systemd-env-066220
	
	I1218 01:39:53.643109 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:53.662440 1379211 main.go:143] libmachine: Using SSH client type: native
	I1218 01:39:53.662754 1379211 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34180 <nil> <nil>}
	I1218 01:39:53.662771 1379211 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sforce-systemd-env-066220' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 force-systemd-env-066220/g' /etc/hosts;
				else 
					echo '127.0.1.1 force-systemd-env-066220' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 01:39:53.816575 1379211 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 01:39:53.816600 1379211 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 01:39:53.816633 1379211 ubuntu.go:190] setting up certificates
	I1218 01:39:53.816643 1379211 provision.go:84] configureAuth start
	I1218 01:39:53.816705 1379211 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-066220
	I1218 01:39:53.833390 1379211 provision.go:143] copyHostCerts
	I1218 01:39:53.833437 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 01:39:53.833481 1379211 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 01:39:53.833494 1379211 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 01:39:53.833579 1379211 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 01:39:53.833667 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 01:39:53.833692 1379211 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 01:39:53.833705 1379211 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 01:39:53.833734 1379211 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 01:39:53.833782 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 01:39:53.833802 1379211 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 01:39:53.833809 1379211 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 01:39:53.833834 1379211 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 01:39:53.833888 1379211 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.force-systemd-env-066220 san=[127.0.0.1 192.168.85.2 force-systemd-env-066220 localhost minikube]
	I1218 01:39:54.349776 1379211 provision.go:177] copyRemoteCerts
	I1218 01:39:54.349846 1379211 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 01:39:54.349902 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:54.366501 1379211 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34180 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa Username:docker}
	I1218 01:39:55.386677 1340150 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000318573s
	I1218 01:39:55.386711 1340150 kubeadm.go:319] 
	I1218 01:39:55.386801 1340150 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1218 01:39:55.386856 1340150 kubeadm.go:319] 	- The kubelet is not running
	I1218 01:39:55.386982 1340150 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1218 01:39:55.386992 1340150 kubeadm.go:319] 
	I1218 01:39:55.387097 1340150 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1218 01:39:55.387129 1340150 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1218 01:39:55.387160 1340150 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1218 01:39:55.387164 1340150 kubeadm.go:319] 
	I1218 01:39:55.391847 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1218 01:39:55.392362 1340150 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1218 01:39:55.392500 1340150 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1218 01:39:55.392761 1340150 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1218 01:39:55.392774 1340150 kubeadm.go:319] 
	I1218 01:39:55.392848 1340150 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1218 01:39:55.392914 1340150 kubeadm.go:403] duration metric: took 12m10.818401776s to StartCluster
	I1218 01:39:55.392964 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-apiserver Namespaces:[]}
	I1218 01:39:55.393031 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1218 01:39:55.457289 1340150 cri.go:89] found id: ""
	I1218 01:39:55.457323 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.457336 1340150 logs.go:284] No container was found matching "kube-apiserver"
	I1218 01:39:55.457343 1340150 cri.go:54] listing CRI containers in root : {State:all Name:etcd Namespaces:[]}
	I1218 01:39:55.457411 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1218 01:39:55.505387 1340150 cri.go:89] found id: ""
	I1218 01:39:55.505410 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.505419 1340150 logs.go:284] No container was found matching "etcd"
	I1218 01:39:55.505430 1340150 cri.go:54] listing CRI containers in root : {State:all Name:coredns Namespaces:[]}
	I1218 01:39:55.505538 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1218 01:39:55.546732 1340150 cri.go:89] found id: ""
	I1218 01:39:55.546756 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.546766 1340150 logs.go:284] No container was found matching "coredns"
	I1218 01:39:55.546780 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-scheduler Namespaces:[]}
	I1218 01:39:55.546838 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1218 01:39:55.578795 1340150 cri.go:89] found id: ""
	I1218 01:39:55.578821 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.578830 1340150 logs.go:284] No container was found matching "kube-scheduler"
	I1218 01:39:55.578837 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-proxy Namespaces:[]}
	I1218 01:39:55.578894 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1218 01:39:55.615657 1340150 cri.go:89] found id: ""
	I1218 01:39:55.615683 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.615693 1340150 logs.go:284] No container was found matching "kube-proxy"
	I1218 01:39:55.615699 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kube-controller-manager Namespaces:[]}
	I1218 01:39:55.615765 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1218 01:39:55.663057 1340150 cri.go:89] found id: ""
	I1218 01:39:55.663098 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.663107 1340150 logs.go:284] No container was found matching "kube-controller-manager"
	I1218 01:39:55.663114 1340150 cri.go:54] listing CRI containers in root : {State:all Name:kindnet Namespaces:[]}
	I1218 01:39:55.663175 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1218 01:39:55.697387 1340150 cri.go:89] found id: ""
	I1218 01:39:55.697414 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.697424 1340150 logs.go:284] No container was found matching "kindnet"
	I1218 01:39:55.697430 1340150 cri.go:54] listing CRI containers in root : {State:all Name:storage-provisioner Namespaces:[]}
	I1218 01:39:55.697493 1340150 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1218 01:39:55.728614 1340150 cri.go:89] found id: ""
	I1218 01:39:55.728661 1340150 logs.go:282] 0 containers: []
	W1218 01:39:55.728675 1340150 logs.go:284] No container was found matching "storage-provisioner"
	I1218 01:39:55.728695 1340150 logs.go:123] Gathering logs for CRI-O ...
	I1218 01:39:55.728746 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u crio -n 400"
	I1218 01:39:55.764321 1340150 logs.go:123] Gathering logs for container status ...
	I1218 01:39:55.764399 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1218 01:39:55.803982 1340150 logs.go:123] Gathering logs for kubelet ...
	I1218 01:39:55.804056 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1218 01:39:55.897653 1340150 logs.go:123] Gathering logs for dmesg ...
	I1218 01:39:55.897732 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1218 01:39:55.915692 1340150 logs.go:123] Gathering logs for describe nodes ...
	I1218 01:39:55.915717 1340150 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1218 01:39:56.022499 1340150 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	W1218 01:39:56.022578 1340150 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000318573s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1218 01:39:56.022632 1340150 out.go:285] * 
	W1218 01:39:56.022747 1340150 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: identical to the kubeadm init output printed above (kubelet-check failed after 4m0.000318573s; kubelet not running or misconfigured)
	
	W1218 01:39:56.022812 1340150 out.go:285] * 
	W1218 01:39:56.025404 1340150 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 01:39:56.031643 1340150 out.go:203] 
	W1218 01:39:56.034657 1340150 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: identical to the kubeadm init output printed above (kubelet-check failed after 4m0.000318573s; kubelet not running or misconfigured)
	
	W1218 01:39:56.034816 1340150 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1218 01:39:56.034877 1340150 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1218 01:39:56.038133 1340150 out.go:203] 
	I1218 01:39:54.471903 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1218 01:39:54.471971 1379211 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 01:39:54.488839 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1218 01:39:54.488908 1379211 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1218 01:39:54.505473 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1218 01:39:54.505536 1379211 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 01:39:54.523306 1379211 provision.go:87] duration metric: took 706.638462ms to configureAuth
	I1218 01:39:54.523334 1379211 ubuntu.go:206] setting minikube options for container-runtime
	I1218 01:39:54.523525 1379211 config.go:182] Loaded profile config "force-systemd-env-066220": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:39:54.523636 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:54.540346 1379211 main.go:143] libmachine: Using SSH client type: native
	I1218 01:39:54.540649 1379211 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34180 <nil> <nil>}
	I1218 01:39:54.540662 1379211 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 01:39:54.847444 1379211 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 01:39:54.847468 1379211 machine.go:97] duration metric: took 4.565028001s to provisionDockerMachine
	I1218 01:39:54.847479 1379211 client.go:176] duration metric: took 10.211726748s to LocalClient.Create
	I1218 01:39:54.847493 1379211 start.go:167] duration metric: took 10.211791838s to libmachine.API.Create "force-systemd-env-066220"
	I1218 01:39:54.847500 1379211 start.go:293] postStartSetup for "force-systemd-env-066220" (driver="docker")
	I1218 01:39:54.847510 1379211 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 01:39:54.847588 1379211 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 01:39:54.847639 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:54.865413 1379211 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34180 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa Username:docker}
	I1218 01:39:54.972480 1379211 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 01:39:54.975711 1379211 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 01:39:54.975739 1379211 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 01:39:54.975751 1379211 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 01:39:54.975810 1379211 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 01:39:54.975900 1379211 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 01:39:54.975912 1379211 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> /etc/ssl/certs/11595522.pem
	I1218 01:39:54.976023 1379211 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 01:39:54.983373 1379211 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:39:55.001790 1379211 start.go:296] duration metric: took 154.275164ms for postStartSetup
	I1218 01:39:55.002157 1379211 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-066220
	I1218 01:39:55.025572 1379211 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/force-systemd-env-066220/config.json ...
	I1218 01:39:55.025867 1379211 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:39:55.025930 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:55.042246 1379211 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34180 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa Username:docker}
	I1218 01:39:55.145290 1379211 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 01:39:55.150121 1379211 start.go:128] duration metric: took 10.518020003s to createHost
	I1218 01:39:55.150146 1379211 start.go:83] releasing machines lock for "force-systemd-env-066220", held for 10.518141222s
	I1218 01:39:55.150240 1379211 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-066220
	I1218 01:39:55.167295 1379211 ssh_runner.go:195] Run: cat /version.json
	I1218 01:39:55.167349 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:55.167646 1379211 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 01:39:55.167704 1379211 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-066220
	I1218 01:39:55.195905 1379211 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34180 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa Username:docker}
	I1218 01:39:55.200406 1379211 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34180 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/force-systemd-env-066220/id_rsa Username:docker}
	I1218 01:39:55.304031 1379211 ssh_runner.go:195] Run: systemctl --version
	I1218 01:39:55.403398 1379211 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 01:39:55.463080 1379211 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 01:39:55.473175 1379211 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 01:39:55.473241 1379211 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 01:39:55.511650 1379211 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1218 01:39:55.511671 1379211 start.go:496] detecting cgroup driver to use...
	I1218 01:39:55.511688 1379211 start.go:500] using "systemd" cgroup driver as enforced via flags
	I1218 01:39:55.511749 1379211 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 01:39:55.534604 1379211 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 01:39:55.549986 1379211 docker.go:218] disabling cri-docker service (if available) ...
	I1218 01:39:55.550092 1379211 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 01:39:55.569785 1379211 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 01:39:55.590371 1379211 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 01:39:55.776972 1379211 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 01:39:55.950327 1379211 docker.go:234] disabling docker service ...
	I1218 01:39:55.950415 1379211 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 01:39:55.981787 1379211 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 01:39:55.996188 1379211 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 01:39:56.251368 1379211 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 01:39:56.409328 1379211 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 01:39:56.429533 1379211 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 01:39:56.445081 1379211 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 01:39:56.445166 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.455575 1379211 crio.go:70] configuring cri-o to use "systemd" as cgroup driver...
	I1218 01:39:56.455637 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "systemd"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.464396 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.477912 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.487159 1379211 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 01:39:56.495547 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.505112 1379211 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.519973 1379211 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:39:56.528996 1379211 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 01:39:56.537260 1379211 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 01:39:56.545376 1379211 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:39:56.733479 1379211 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 01:39:56.933218 1379211 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 01:39:56.933295 1379211 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 01:39:56.940803 1379211 start.go:564] Will wait 60s for crictl version
	I1218 01:39:56.940871 1379211 ssh_runner.go:195] Run: which crictl
	I1218 01:39:56.946272 1379211 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 01:39:57.002982 1379211 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 01:39:57.003068 1379211 ssh_runner.go:195] Run: crio --version
	I1218 01:39:57.043785 1379211 ssh_runner.go:195] Run: crio --version
	I1218 01:39:57.081215 1379211 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	
	
	==> CRI-O <==
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.697385233Z" level=info msg="Registered SIGHUP reload watcher"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.697555263Z" level=info msg="Starting seccomp notifier watcher"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.697677131Z" level=info msg="Create NRI interface"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.697848785Z" level=info msg="built-in NRI default validator is disabled"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698095547Z" level=info msg="runtime interface created"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698132158Z" level=info msg="Registered domain \"k8s.io\" with NRI"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698139993Z" level=info msg="runtime interface starting up..."
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698146959Z" level=info msg="starting plugins..."
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698164575Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 18 01:27:37 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:27:37.698239289Z" level=info msg="No systemd watchdog enabled"
	Dec 18 01:27:37 kubernetes-upgrade-823559 systemd[1]: Started crio.service - Container Runtime Interface for OCI (CRI-O).
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.750539123Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=f4298ab2-11f0-4ee5-8a6b-5a5d442e6e1f name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.751267972Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=19744357-18cd-4d1d-88e1-291f939123d0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.751894793Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=087d49e2-a99e-4124-b018-c8df7ffff596 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.75238195Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=41cc20bc-4eef-4ae6-9638-87e7e4ca066c name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.752901418Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=383469c0-af29-4900-8b42-019e545eb7b0 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.753479608Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=ed697c3e-f1a3-4393-8123-b2753ae7a103 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:31:50 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:31:50.753993136Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=d79c503d-abce-4fc5-b34d-8d3bd2525076 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.436208515Z" level=info msg="Checking image status: registry.k8s.io/kube-apiserver:v1.35.0-rc.1" id=949d13cd-2748-42e3-8ad7-6a1adf816a2b name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.436863175Z" level=info msg="Checking image status: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" id=7d6109fd-8697-4b72-a962-b8aad716b870 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.437438838Z" level=info msg="Checking image status: registry.k8s.io/kube-scheduler:v1.35.0-rc.1" id=df46538f-4905-40d6-a2f3-39e02173f9cc name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.437884872Z" level=info msg="Checking image status: registry.k8s.io/kube-proxy:v1.35.0-rc.1" id=77978b5e-b580-4c90-b031-6ebbf5eb6214 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.438270362Z" level=info msg="Checking image status: registry.k8s.io/coredns/coredns:v1.13.1" id=070c29e9-f0fb-43e2-918a-20ea167696f1 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.438704918Z" level=info msg="Checking image status: registry.k8s.io/pause:3.10.1" id=3ad540e6-9b7f-4492-82be-9b446cd55677 name=/runtime.v1.ImageService/ImageStatus
	Dec 18 01:35:53 kubernetes-upgrade-823559 crio[616]: time="2025-12-18T01:35:53.43919173Z" level=info msg="Checking image status: registry.k8s.io/etcd:3.6.6-0" id=5f2e33b0-1952-4148-9d45-051cafda4a13 name=/runtime.v1.ImageService/ImageStatus
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[ +34.067705] overlayfs: idmapped layers are currently not supported
	[Dec18 01:06] overlayfs: idmapped layers are currently not supported
	[Dec18 01:07] overlayfs: idmapped layers are currently not supported
	[  +2.867801] overlayfs: idmapped layers are currently not supported
	[Dec18 01:08] overlayfs: idmapped layers are currently not supported
	[Dec18 01:09] overlayfs: idmapped layers are currently not supported
	[Dec18 01:10] overlayfs: idmapped layers are currently not supported
	[Dec18 01:14] overlayfs: idmapped layers are currently not supported
	[Dec18 01:15] overlayfs: idmapped layers are currently not supported
	[Dec18 01:16] overlayfs: idmapped layers are currently not supported
	[ +41.843420] overlayfs: idmapped layers are currently not supported
	[Dec18 01:17] overlayfs: idmapped layers are currently not supported
	[Dec18 01:18] overlayfs: idmapped layers are currently not supported
	[Dec18 01:19] overlayfs: idmapped layers are currently not supported
	[  +7.804932] overlayfs: idmapped layers are currently not supported
	[Dec18 01:20] overlayfs: idmapped layers are currently not supported
	[ +26.176950] overlayfs: idmapped layers are currently not supported
	[Dec18 01:21] overlayfs: idmapped layers are currently not supported
	[ +26.122242] overlayfs: idmapped layers are currently not supported
	[Dec18 01:22] overlayfs: idmapped layers are currently not supported
	[Dec18 01:23] overlayfs: idmapped layers are currently not supported
	[Dec18 01:25] overlayfs: idmapped layers are currently not supported
	[Dec18 01:27] overlayfs: idmapped layers are currently not supported
	[Dec18 01:37] overlayfs: idmapped layers are currently not supported
	[Dec18 01:39] overlayfs: idmapped layers are currently not supported
	
	
	==> kernel <==
	 01:39:58 up  8:22,  0 user,  load average: 2.60, 1.88, 1.90
	Linux kubernetes-upgrade-823559 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12354]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12354]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12354]: E1218 01:39:56.250886   12354 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 964.
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12360]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12360]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:56 kubernetes-upgrade-823559 kubelet[12360]: E1218 01:39:56.976155   12360 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:39:56 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:39:57 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 965.
	Dec 18 01:39:57 kubernetes-upgrade-823559 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:57 kubernetes-upgrade-823559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:57 kubernetes-upgrade-823559 kubelet[12381]: Flag --cgroups-per-qos has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:57 kubernetes-upgrade-823559 kubelet[12381]: Flag --enforce-node-allocatable has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
	Dec 18 01:39:57 kubernetes-upgrade-823559 kubelet[12381]: E1218 01:39:57.710362   12381 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 18 01:39:57 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 18 01:39:57 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 18 01:39:58 kubernetes-upgrade-823559 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 966.
	Dec 18 01:39:58 kubernetes-upgrade-823559 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 18 01:39:58 kubernetes-upgrade-823559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-823559 -n kubernetes-upgrade-823559
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-823559 -n kubernetes-upgrade-823559: exit status 2 (407.570864ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "kubernetes-upgrade-823559" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:176: Cleaning up "kubernetes-upgrade-823559" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-823559
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-823559: (2.41647983s)
--- FAIL: TestKubernetesUpgrade (796.40s)
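The kubelet restart loop above ("kubelet is configured to not run on a host using cgroup v1") matches the kubeadm SystemVerification warning: kubelet v1.35+ rejects cgroup v1 hosts unless `FailCgroupV1` is explicitly set to `false`. As a quick check, outside the test suite, one way to see which cgroup version a host runs (a diagnostic sketch, not part of minikube):

```shell
# Detect cgroup v1 vs v2 by the filesystem type mounted at /sys/fs/cgroup.
# cgroup v2 (unified hierarchy) mounts as cgroup2fs; v1 mounts a tmpfs
# holding per-controller hierarchies. This CI host (5.15.0-1084-aws,
# Ubuntu 20.04) is on cgroup v1, which is why kubelet refuses to start.
fstype=$(stat -fc %T /sys/fs/cgroup/ 2>/dev/null)
if [ "$fstype" = "cgroup2fs" ]; then
    echo "cgroup v2"
else
    echo "cgroup v1"
fi
```

On a v1 host, remediation is either booting with `systemd.unified_cgroup_hierarchy=1` or setting the kubelet's `failCgroupV1: false`, per the KEP link in the warning above.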

                                                
                                    
TestPause/serial/Pause (6.93s)

                                                
                                                
=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-022448 --alsologtostderr -v=5
pause_test.go:110: (dbg) Non-zero exit: out/minikube-linux-arm64 pause -p pause-022448 --alsologtostderr -v=5: exit status 80 (2.481711991s)

                                                
                                                
-- stdout --
	* Pausing node pause-022448 ... 
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1218 01:38:48.253548 1373915 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:38:48.254335 1373915 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:48.254350 1373915 out.go:374] Setting ErrFile to fd 2...
	I1218 01:38:48.254356 1373915 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:48.254638 1373915 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:38:48.254909 1373915 out.go:368] Setting JSON to false
	I1218 01:38:48.254935 1373915 mustload.go:66] Loading cluster: pause-022448
	I1218 01:38:48.255449 1373915 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:48.255976 1373915 cli_runner.go:164] Run: docker container inspect pause-022448 --format={{.State.Status}}
	I1218 01:38:48.272257 1373915 host.go:66] Checking if "pause-022448" exists ...
	I1218 01:38:48.272583 1373915 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:38:48.325726 1373915 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:2 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:51 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:38:48.316589695 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:38:48.326481 1373915 pause.go:60] "namespaces" [kube-system kubernetes-dashboard istio-operator]="keys" map[addons:[] all:%!s(bool=false) apiserver-ips:[] apiserver-name:minikubeCA apiserver-names:[] apiserver-port:%!s(int=8443) auto-pause-interval:1m0s auto-update-drivers:%!s(bool=true) base-image:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 binary-mirror: bootstrapper:kubeadm cache-images:%!s(bool=true) cancel-scheduled:%!s(bool=false) cert-expiration:26280h0m0s cni: container-runtime: cpus:2 cri-socket: delete-on-failure:%!s(bool=false) disable-coredns-log:%!s(bool=false) disable-driver-mounts:%!s(bool=false) disable-metrics:%!s(bool=false) disable-optimizations:%!s(bool=false) disk-size:20000mb dns-domain:cluster.local dns-proxy:%!s(bool=false) docker-env:[] docker-opt:[] download-only:%!s(bool=false) driver: dry-run:%!s(bool=false) embed-certs:%!s(bool=false) embedcerts:%!s(bool=false) enable-default-
cni:%!s(bool=false) extra-config: extra-disks:%!s(int=0) feature-gates: force:%!s(bool=false) force-systemd:%!s(bool=false) gpus: ha:%!s(bool=false) host-dns-resolver:%!s(bool=true) host-only-cidr:192.168.59.1/24 host-only-nic-type:virtio hyperkit-vpnkit-sock: hyperkit-vsock-ports:[] hyperv-external-adapter: hyperv-use-external-switch:%!s(bool=false) hyperv-virtual-switch: image-mirror-country: image-repository: insecure-registry:[] install-addons:%!s(bool=true) interactive:%!s(bool=true) iso-url:[https://storage.googleapis.com/minikube-builds/iso/22186/minikube-v1.37.0-1765965980-22186-arm64.iso https://github.com/kubernetes/minikube/releases/download/v1.37.0-1765965980-22186/minikube-v1.37.0-1765965980-22186-arm64.iso https://kubernetes.oss-cn-hangzhou.aliyuncs.com/minikube/iso/minikube-v1.37.0-1765965980-22186-arm64.iso] keep-context:%!s(bool=false) keep-context-active:%!s(bool=false) kubernetes-version: kvm-gpu:%!s(bool=false) kvm-hidden:%!s(bool=false) kvm-network:default kvm-numa-count:%!s(int=1) kvm-qe
mu-uri:qemu:///system listen-address: maxauditentries:%!s(int=1000) memory: mount:%!s(bool=false) mount-9p-version:9p2000.L mount-gid:docker mount-ip: mount-msize:%!s(int=262144) mount-options:[] mount-port:0 mount-string: mount-type:9p mount-uid:docker namespace:default nat-nic-type:virtio native-ssh:%!s(bool=true) network: network-plugin: nfs-share:[] nfs-shares-root:/nfsshares no-kubernetes:%!s(bool=false) no-vtx-check:%!s(bool=false) nodes:%!s(int=1) output:text ports:[] preload:%!s(bool=true) profile:pause-022448 purge:%!s(bool=false) qemu-firmware-path: registry-mirror:[] reminderwaitperiodinhours:%!s(int=24) rootless:%!s(bool=false) schedule:0s service-cluster-ip-range:10.96.0.0/12 skip-audit:%!s(bool=false) socket-vmnet-client-path: socket-vmnet-path: ssh-ip-address: ssh-key: ssh-port:%!s(int=22) ssh-user:root static-ip: subnet: trace: user: uuid: vm:%!s(bool=false) vm-driver: wait:[apiserver system_pods] wait-timeout:6m0s wantnonedriverwarning:%!s(bool=true) wantupdatenotification:%!s(bool=true) want
virtualboxdriverwarning:%!s(bool=true)]="(MISSING)"
	I1218 01:38:48.329455 1373915 out.go:179] * Pausing node pause-022448 ... 
	I1218 01:38:48.333125 1373915 host.go:66] Checking if "pause-022448" exists ...
	I1218 01:38:48.333467 1373915 ssh_runner.go:195] Run: systemctl --version
	I1218 01:38:48.333521 1373915 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:48.350300 1373915 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:48.455181 1373915 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:48.467427 1373915 pause.go:52] kubelet running: true
	I1218 01:38:48.467520 1373915 ssh_runner.go:195] Run: sudo systemctl disable --now kubelet
	I1218 01:38:48.671228 1373915 cri.go:54] listing CRI containers in root : {State:running Name: Namespaces:[kube-system kubernetes-dashboard istio-operator]}
	I1218 01:38:48.671355 1373915 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system; crictl ps -a --quiet --label io.kubernetes.pod.namespace=kubernetes-dashboard; crictl ps -a --quiet --label io.kubernetes.pod.namespace=istio-operator"
	I1218 01:38:48.738500 1373915 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:48.738522 1373915 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:48.738527 1373915 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:48.738531 1373915 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:48.738535 1373915 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:48.738538 1373915 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:48.738542 1373915 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:48.738545 1373915 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:48.738548 1373915 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:48.738555 1373915 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:48.738558 1373915 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:48.738562 1373915 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:48.738565 1373915 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:48.738568 1373915 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:48.738571 1373915 cri.go:89] found id: ""
	I1218 01:38:48.738620 1373915 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 01:38:48.749270 1373915 retry.go:31] will retry after 143.437447ms: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:48Z" level=error msg="open /run/runc: no such file or directory"
	I1218 01:38:48.893493 1373915 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:48.907821 1373915 pause.go:52] kubelet running: false
	I1218 01:38:48.907892 1373915 ssh_runner.go:195] Run: sudo systemctl disable --now kubelet
	I1218 01:38:49.064422 1373915 cri.go:54] listing CRI containers in root : {State:running Name: Namespaces:[kube-system kubernetes-dashboard istio-operator]}
	I1218 01:38:49.064510 1373915 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system; crictl ps -a --quiet --label io.kubernetes.pod.namespace=kubernetes-dashboard; crictl ps -a --quiet --label io.kubernetes.pod.namespace=istio-operator"
	I1218 01:38:49.134069 1373915 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:49.134090 1373915 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:49.134095 1373915 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:49.134109 1373915 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:49.134113 1373915 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:49.134117 1373915 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:49.134120 1373915 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:49.134123 1373915 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:49.134127 1373915 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:49.134142 1373915 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:49.134151 1373915 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:49.134154 1373915 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:49.134158 1373915 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:49.134161 1373915 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:49.134164 1373915 cri.go:89] found id: ""
	I1218 01:38:49.134210 1373915 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 01:38:49.144670 1373915 retry.go:31] will retry after 525.471105ms: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:49Z" level=error msg="open /run/runc: no such file or directory"
	I1218 01:38:49.670418 1373915 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:49.683847 1373915 pause.go:52] kubelet running: false
	I1218 01:38:49.683913 1373915 ssh_runner.go:195] Run: sudo systemctl disable --now kubelet
	I1218 01:38:49.821368 1373915 cri.go:54] listing CRI containers in root : {State:running Name: Namespaces:[kube-system kubernetes-dashboard istio-operator]}
	I1218 01:38:49.821456 1373915 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system; crictl ps -a --quiet --label io.kubernetes.pod.namespace=kubernetes-dashboard; crictl ps -a --quiet --label io.kubernetes.pod.namespace=istio-operator"
	I1218 01:38:49.892892 1373915 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:49.892913 1373915 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:49.892918 1373915 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:49.892922 1373915 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:49.892925 1373915 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:49.892935 1373915 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:49.892967 1373915 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:49.892977 1373915 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:49.892987 1373915 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:49.892993 1373915 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:49.892996 1373915 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:49.892999 1373915 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:49.893003 1373915 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:49.893007 1373915 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:49.893012 1373915 cri.go:89] found id: ""
	I1218 01:38:49.893067 1373915 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 01:38:49.904134 1373915 retry.go:31] will retry after 506.218267ms: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:49Z" level=error msg="open /run/runc: no such file or directory"
	I1218 01:38:50.410600 1373915 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:50.425186 1373915 pause.go:52] kubelet running: false
	I1218 01:38:50.425268 1373915 ssh_runner.go:195] Run: sudo systemctl disable --now kubelet
	I1218 01:38:50.571684 1373915 cri.go:54] listing CRI containers in root : {State:running Name: Namespaces:[kube-system kubernetes-dashboard istio-operator]}
	I1218 01:38:50.571820 1373915 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system; crictl ps -a --quiet --label io.kubernetes.pod.namespace=kubernetes-dashboard; crictl ps -a --quiet --label io.kubernetes.pod.namespace=istio-operator"
	I1218 01:38:50.634838 1373915 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:50.634857 1373915 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:50.634862 1373915 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:50.634866 1373915 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:50.634869 1373915 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:50.634872 1373915 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:50.634875 1373915 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:50.634878 1373915 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:50.634881 1373915 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:50.634887 1373915 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:50.634890 1373915 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:50.634893 1373915 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:50.634897 1373915 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:50.634900 1373915 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:50.634903 1373915 cri.go:89] found id: ""
	I1218 01:38:50.634950 1373915 ssh_runner.go:195] Run: sudo runc list -f json
	I1218 01:38:50.649042 1373915 out.go:203] 
	W1218 01:38:50.651901 1373915 out.go:285] X Exiting due to GUEST_PAUSE: Pause: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:50Z" level=error msg="open /run/runc: no such file or directory"
	
	X Exiting due to GUEST_PAUSE: Pause: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:50Z" level=error msg="open /run/runc: no such file or directory"
	
	W1218 01:38:50.651923 1373915 out.go:285] * 
	* 
	W1218 01:38:50.660375 1373915 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_pause_49fdaea37aad8ebccb761973c21590cc64efe8d9_0.log                   │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_pause_49fdaea37aad8ebccb761973c21590cc64efe8d9_0.log                   │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1218 01:38:50.663301 1373915 out.go:203] 

** /stderr **
pause_test.go:112: failed to pause minikube with args: "out/minikube-linux-arm64 pause -p pause-022448 --alsologtostderr -v=5" : exit status 80
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestPause/serial/Pause]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestPause/serial/Pause]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect pause-022448
helpers_test.go:244: (dbg) docker inspect pause-022448:

-- stdout --
	[
	    {
	        "Id": "bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443",
	        "Created": "2025-12-18T01:37:33.636968388Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1370224,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T01:37:33.696677184Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/hostname",
	        "HostsPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/hosts",
	        "LogPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443-json.log",
	        "Name": "/pause-022448",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "pause-022448:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "pause-022448",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443",
	                "LowerDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/merged",
	                "UpperDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/diff",
	                "WorkDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "pause-022448",
	                "Source": "/var/lib/docker/volumes/pause-022448/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "pause-022448",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "pause-022448",
	                "name.minikube.sigs.k8s.io": "pause-022448",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "43f2c939981ad4e85c64ef4a2af00c4e228079cff99bb209764229851f7874a0",
	            "SandboxKey": "/var/run/docker/netns/43f2c939981a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34170"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34171"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34174"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34172"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34173"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "pause-022448": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.85.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "ba:12:aa:5f:cd:c8",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "832e1b8b28b9ef274f679f0160b0827750e9cd6343166763adaf998d26deedb6",
	                    "EndpointID": "fbd81b37a63ec55e79c692c4f04d24ec7c650d0dd040afca26e0684cacd1571a",
	                    "Gateway": "192.168.85.1",
	                    "IPAddress": "192.168.85.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "pause-022448",
	                        "bc3df9cf646c"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p pause-022448 -n pause-022448
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p pause-022448 -n pause-022448: exit status 2 (338.484048ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestPause/serial/Pause FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestPause/serial/Pause]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p pause-022448 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p pause-022448 logs -n 25: (1.35454263s)
helpers_test.go:261: TestPause/serial/Pause logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                     ARGS                                                                      │          PROFILE          │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -p NoKubernetes-573547 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:25 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p missing-upgrade-381437 --memory=3072 --driver=docker  --container-runtime=crio                                                             │ missing-upgrade-381437    │ jenkins │ v1.35.0 │ 18 Dec 25 01:25 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ delete  │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ ssh     │ -p NoKubernetes-573547 sudo systemctl is-active --quiet service kubelet                                                                       │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │                     │
	│ start   │ -p missing-upgrade-381437 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ missing-upgrade-381437    │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:27 UTC │
	│ stop    │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --driver=docker  --container-runtime=crio                                                                              │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ ssh     │ -p NoKubernetes-573547 sudo systemctl is-active --quiet service kubelet                                                                       │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │                     │
	│ delete  │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio      │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:27 UTC │
	│ delete  │ -p missing-upgrade-381437                                                                                                                     │ missing-upgrade-381437    │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p stopped-upgrade-156815 --memory=3072 --vm-driver=docker  --container-runtime=crio                                                          │ stopped-upgrade-156815    │ jenkins │ v1.35.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ stop    │ -p kubernetes-upgrade-823559                                                                                                                  │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │                     │
	│ stop    │ stopped-upgrade-156815 stop                                                                                                                   │ stopped-upgrade-156815    │ jenkins │ v1.35.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p stopped-upgrade-156815 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ stopped-upgrade-156815    │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:32 UTC │
	│ delete  │ -p stopped-upgrade-156815                                                                                                                     │ stopped-upgrade-156815    │ jenkins │ v1.37.0 │ 18 Dec 25 01:32 UTC │ 18 Dec 25 01:32 UTC │
	│ start   │ -p running-upgrade-850997 --memory=3072 --vm-driver=docker  --container-runtime=crio                                                          │ running-upgrade-850997    │ jenkins │ v1.35.0 │ 18 Dec 25 01:32 UTC │ 18 Dec 25 01:33 UTC │
	│ start   │ -p running-upgrade-850997 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ running-upgrade-850997    │ jenkins │ v1.37.0 │ 18 Dec 25 01:33 UTC │ 18 Dec 25 01:37 UTC │
	│ delete  │ -p running-upgrade-850997                                                                                                                     │ running-upgrade-850997    │ jenkins │ v1.37.0 │ 18 Dec 25 01:37 UTC │ 18 Dec 25 01:37 UTC │
	│ start   │ -p pause-022448 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=crio                                     │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:37 UTC │ 18 Dec 25 01:38 UTC │
	│ start   │ -p pause-022448 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                                              │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:38 UTC │ 18 Dec 25 01:38 UTC │
	│ pause   │ -p pause-022448 --alsologtostderr -v=5                                                                                                        │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:38 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 01:38:20
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 01:38:20.178210 1372607 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:38:20.178403 1372607 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:20.178434 1372607 out.go:374] Setting ErrFile to fd 2...
	I1218 01:38:20.178457 1372607 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:20.178920 1372607 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:38:20.179442 1372607 out.go:368] Setting JSON to false
	I1218 01:38:20.180617 1372607 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":30049,"bootTime":1765991852,"procs":198,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 01:38:20.180755 1372607 start.go:143] virtualization:  
	I1218 01:38:20.183852 1372607 out.go:179] * [pause-022448] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 01:38:20.186054 1372607 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 01:38:20.186120 1372607 notify.go:221] Checking for updates...
	I1218 01:38:20.191884 1372607 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 01:38:20.194771 1372607 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:38:20.197710 1372607 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 01:38:20.200688 1372607 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 01:38:20.203636 1372607 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 01:38:20.207011 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:20.207574 1372607 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 01:38:20.237905 1372607 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 01:38:20.238026 1372607 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:38:20.294866 1372607 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:2 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:51 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:38:20.284860013 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:38:20.294972 1372607 docker.go:319] overlay module found
	I1218 01:38:20.298058 1372607 out.go:179] * Using the docker driver based on existing profile
	I1218 01:38:20.300956 1372607 start.go:309] selected driver: docker
	I1218 01:38:20.300981 1372607 start.go:927] validating driver "docker" against &{Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:20.301115 1372607 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 01:38:20.301236 1372607 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:38:20.359258 1372607 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:2 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:51 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:38:20.350476276 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:38:20.359647 1372607 cni.go:84] Creating CNI manager for ""
	I1218 01:38:20.359701 1372607 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:38:20.359753 1372607 start.go:353] cluster config:
	{Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:20.362991 1372607 out.go:179] * Starting "pause-022448" primary control-plane node in "pause-022448" cluster
	I1218 01:38:20.365720 1372607 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 01:38:20.368622 1372607 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 01:38:20.371409 1372607 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:38:20.371454 1372607 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 01:38:20.371466 1372607 cache.go:65] Caching tarball of preloaded images
	I1218 01:38:20.371488 1372607 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 01:38:20.371554 1372607 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 01:38:20.371565 1372607 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 01:38:20.371695 1372607 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/config.json ...
	I1218 01:38:20.395485 1372607 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 01:38:20.395504 1372607 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 01:38:20.395517 1372607 cache.go:243] Successfully downloaded all kic artifacts
	I1218 01:38:20.395555 1372607 start.go:360] acquireMachinesLock for pause-022448: {Name:mked9a692720255ef16f316861ffd9a1e4f4fa5c Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 01:38:20.395610 1372607 start.go:364] duration metric: took 37.078µs to acquireMachinesLock for "pause-022448"
	I1218 01:38:20.395630 1372607 start.go:96] Skipping create...Using existing machine configuration
	I1218 01:38:20.395635 1372607 fix.go:54] fixHost starting: 
	I1218 01:38:20.395885 1372607 cli_runner.go:164] Run: docker container inspect pause-022448 --format={{.State.Status}}
	I1218 01:38:20.423500 1372607 fix.go:112] recreateIfNeeded on pause-022448: state=Running err=<nil>
	W1218 01:38:20.423528 1372607 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 01:38:20.426614 1372607 out.go:252] * Updating the running docker "pause-022448" container ...
	I1218 01:38:20.426651 1372607 machine.go:94] provisionDockerMachine start ...
	I1218 01:38:20.426754 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.452585 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.452912 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.452921 1372607 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 01:38:20.607980 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-022448
	
	I1218 01:38:20.608019 1372607 ubuntu.go:182] provisioning hostname "pause-022448"
	I1218 01:38:20.608086 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.625625 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.625934 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.625951 1372607 main.go:143] libmachine: About to run SSH command:
	sudo hostname pause-022448 && echo "pause-022448" | sudo tee /etc/hostname
	I1218 01:38:20.790700 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-022448
	
	I1218 01:38:20.790777 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.810422 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.810742 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.810764 1372607 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-022448' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-022448/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-022448' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 01:38:20.968737 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 01:38:20.968765 1372607 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 01:38:20.968793 1372607 ubuntu.go:190] setting up certificates
	I1218 01:38:20.968802 1372607 provision.go:84] configureAuth start
	I1218 01:38:20.968875 1372607 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-022448
	I1218 01:38:20.986662 1372607 provision.go:143] copyHostCerts
	I1218 01:38:20.986743 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 01:38:20.986756 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 01:38:20.986831 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 01:38:20.986934 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 01:38:20.986943 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 01:38:20.986970 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 01:38:20.987026 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 01:38:20.987034 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 01:38:20.987057 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 01:38:20.987104 1372607 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.pause-022448 san=[127.0.0.1 192.168.85.2 localhost minikube pause-022448]
	I1218 01:38:21.213810 1372607 provision.go:177] copyRemoteCerts
	I1218 01:38:21.213877 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 01:38:21.213915 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:21.231304 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:21.341140 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 01:38:21.359392 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1218 01:38:21.377862 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 01:38:21.397159 1372607 provision.go:87] duration metric: took 428.329775ms to configureAuth
	I1218 01:38:21.397185 1372607 ubuntu.go:206] setting minikube options for container-runtime
	I1218 01:38:21.397415 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:21.397533 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:21.422373 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:21.422766 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:21.422788 1372607 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 01:38:26.806476 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 01:38:26.806497 1372607 machine.go:97] duration metric: took 6.379838428s to provisionDockerMachine
	I1218 01:38:26.806509 1372607 start.go:293] postStartSetup for "pause-022448" (driver="docker")
	I1218 01:38:26.806534 1372607 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 01:38:26.806601 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 01:38:26.806651 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:26.824156 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:26.933046 1372607 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 01:38:26.936496 1372607 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 01:38:26.936522 1372607 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 01:38:26.936533 1372607 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 01:38:26.936586 1372607 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 01:38:26.936668 1372607 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 01:38:26.936783 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 01:38:26.944358 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:38:26.960952 1372607 start.go:296] duration metric: took 154.42761ms for postStartSetup
	I1218 01:38:26.961029 1372607 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:38:26.961071 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:26.977821 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.081422 1372607 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 01:38:27.086061 1372607 fix.go:56] duration metric: took 6.690419014s for fixHost
	I1218 01:38:27.086083 1372607 start.go:83] releasing machines lock for "pause-022448", held for 6.690463508s
	I1218 01:38:27.086153 1372607 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-022448
	I1218 01:38:27.102359 1372607 ssh_runner.go:195] Run: cat /version.json
	I1218 01:38:27.102414 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:27.102714 1372607 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 01:38:27.102772 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:27.122653 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.123202 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.324693 1372607 ssh_runner.go:195] Run: systemctl --version
	I1218 01:38:27.330660 1372607 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 01:38:27.369263 1372607 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 01:38:27.373436 1372607 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 01:38:27.373541 1372607 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 01:38:27.381343 1372607 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
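The disable step above renames bridge/podman CNI configs with GNU find's `-printf`/`-exec`. A stand-alone sketch of the same pattern, using a throwaway directory and invented file names in place of `/etc/cni/net.d` (and no `sudo`):

```shell
# Scratch directory standing in for /etc/cni/net.d (hypothetical file names).
dir=$(mktemp -d)
touch "$dir/10-bridge.conflist" "$dir/87-podman.conflist" "$dir/99-loopback.conf"

# Same predicate as the logged command: bridge/podman configs not yet disabled.
find "$dir" -maxdepth 1 -type f \
  \( \( -name '*bridge*' -or -name '*podman*' \) -and -not -name '*.mk_disabled' \) \
  -printf "%p, " -exec sh -c 'mv "$1" "$1.mk_disabled"' _ {} \;
echo
ls "$dir"
```

The loopback config is deliberately untouched, matching the separate `stat /etc/cni/net.d/*loopback.conf*` probe a few lines earlier.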
	I1218 01:38:27.381368 1372607 start.go:496] detecting cgroup driver to use...
	I1218 01:38:27.381398 1372607 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 01:38:27.381444 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 01:38:27.396156 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 01:38:27.409071 1372607 docker.go:218] disabling cri-docker service (if available) ...
	I1218 01:38:27.409134 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 01:38:27.424734 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 01:38:27.437481 1372607 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 01:38:27.562532 1372607 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 01:38:27.699809 1372607 docker.go:234] disabling docker service ...
	I1218 01:38:27.699879 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 01:38:27.714713 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 01:38:27.727295 1372607 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 01:38:27.878545 1372607 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 01:38:28.044551 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 01:38:28.057546 1372607 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 01:38:28.072139 1372607 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 01:38:28.072205 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.081781 1372607 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 01:38:28.081898 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.091291 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.100508 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.109712 1372607 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 01:38:28.117680 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.126856 1372607 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.135450 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
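The run of `sed`/`grep` commands above rewrites CRI-O's drop-in config in place: pause image, cgroup manager, conmon cgroup, and the unprivileged-port sysctl. The same edit sequence can be exercised against a scratch copy (no `sudo`; the starting contents are a made-up plausible default; GNU sed assumed):

```shell
# Scratch stand-in for /etc/crio/crio.conf.d/02-crio.conf with invented defaults.
conf=$(mktemp)
cat > "$conf" <<'EOF'
pause_image = "registry.k8s.io/pause:3.9"
cgroup_manager = "systemd"
conmon_cgroup = "system.slice"
EOF

sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' "$conf"
sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' "$conf"
sed -i '/conmon_cgroup = .*/d' "$conf"                         # drop the old value...
sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' "$conf"  # ...re-add after cgroup_manager
# Ensure a default_sysctls block exists, then prepend the unprivileged-port sysctl.
grep -q '^ *default_sysctls' "$conf" || \
  sed -i '/conmon_cgroup = .*/a default_sysctls = [\n]' "$conf"
sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' "$conf"
cat "$conf"
```

Deleting and re-appending `conmon_cgroup` (rather than substituting) makes the edit idempotent across repeated starts, which matters here since this is a restart of an existing cluster.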
	I1218 01:38:28.144368 1372607 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 01:38:28.151799 1372607 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 01:38:28.159436 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:28.297009 1372607 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 01:38:28.543730 1372607 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 01:38:28.543849 1372607 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 01:38:28.547461 1372607 start.go:564] Will wait 60s for crictl version
	I1218 01:38:28.547530 1372607 ssh_runner.go:195] Run: which crictl
	I1218 01:38:28.550761 1372607 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 01:38:28.578902 1372607 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 01:38:28.579066 1372607 ssh_runner.go:195] Run: crio --version
	I1218 01:38:28.609579 1372607 ssh_runner.go:195] Run: crio --version
	I1218 01:38:28.642456 1372607 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 01:38:28.645402 1372607 cli_runner.go:164] Run: docker network inspect pause-022448 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 01:38:28.663331 1372607 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1218 01:38:28.667663 1372607 kubeadm.go:884] updating cluster {Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 01:38:28.667812 1372607 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:38:28.667876 1372607 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:38:28.713646 1372607 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 01:38:28.713673 1372607 crio.go:433] Images already preloaded, skipping extraction
	I1218 01:38:28.713741 1372607 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:38:28.741356 1372607 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 01:38:28.741376 1372607 cache_images.go:86] Images are preloaded, skipping loading
	I1218 01:38:28.741384 1372607 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.3 crio true true} ...
	I1218 01:38:28.741483 1372607 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=pause-022448 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 01:38:28.741573 1372607 ssh_runner.go:195] Run: crio config
	I1218 01:38:28.813820 1372607 cni.go:84] Creating CNI manager for ""
	I1218 01:38:28.813843 1372607 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:38:28.813865 1372607 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 01:38:28.813887 1372607 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-022448 NodeName:pause-022448 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 01:38:28.814019 1372607 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "pause-022448"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 01:38:28.814098 1372607 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 01:38:28.821849 1372607 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 01:38:28.821967 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 01:38:28.829115 1372607 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (362 bytes)
	I1218 01:38:28.841124 1372607 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 01:38:28.852722 1372607 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2209 bytes)
	I1218 01:38:28.864281 1372607 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1218 01:38:28.867940 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:29.000376 1372607 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 01:38:29.014502 1372607 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448 for IP: 192.168.85.2
	I1218 01:38:29.014533 1372607 certs.go:195] generating shared ca certs ...
	I1218 01:38:29.014549 1372607 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:29.014722 1372607 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 01:38:29.014781 1372607 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 01:38:29.014792 1372607 certs.go:257] generating profile certs ...
	I1218 01:38:29.014905 1372607 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key
	I1218 01:38:29.014989 1372607 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.key.6e282b41
	I1218 01:38:29.015047 1372607 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.key
	I1218 01:38:29.015172 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 01:38:29.015222 1372607 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 01:38:29.015235 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 01:38:29.015280 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 01:38:29.015307 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 01:38:29.015343 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 01:38:29.015392 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:38:29.016071 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 01:38:29.034081 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 01:38:29.051630 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 01:38:29.068750 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 01:38:29.085542 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1218 01:38:29.102863 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 01:38:29.120574 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 01:38:29.138843 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 01:38:29.156856 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 01:38:29.175221 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 01:38:29.194997 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 01:38:29.216317 1372607 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 01:38:29.228656 1372607 ssh_runner.go:195] Run: openssl version
	I1218 01:38:29.234937 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.242063 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 01:38:29.249045 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.252658 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.252720 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.294221 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 01:38:29.301641 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.309837 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 01:38:29.317485 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.321249 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.321311 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.362444 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 01:38:29.369840 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.377151 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 01:38:29.384343 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.399595 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.399739 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.512585 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
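Each `ln -fs` / `openssl x509 -hash` / `test -L` triple above wires a CA PEM into OpenSSL's hash-named trust layout (`/etc/ssl/certs/<subject-hash>.0`). A self-contained sketch of that wiring, using a throwaway self-signed cert in a temp directory instead of the real minikube CA:

```shell
dir=$(mktemp -d)
# Throwaway self-signed CA standing in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$dir/ca.key" \
  -out "$dir/minikubeCA.pem" -subj "/CN=minikubeCA" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/minikubeCA.pem")
ln -fs "$dir/minikubeCA.pem" "$dir/$hash.0"   # hash-named symlink, as in the log
test -L "$dir/$hash.0" && echo "linked as $hash.0"
```

OpenSSL looks certificates up by this subject-name hash, which is why the log verifies the symlink (`test -L .../51391683.0` etc.) rather than the PEM itself.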
	I1218 01:38:29.539512 1372607 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 01:38:29.549234 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 01:38:29.669085 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 01:38:29.784103 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 01:38:29.847217 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 01:38:29.900574 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 01:38:29.954328 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
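The `-checkend 86400` runs above ask OpenSSL whether each control-plane cert will still be valid 24 hours (86,400 seconds) from now; exit status 0 means it will not have expired by then. A sketch with a throwaway cert:

```shell
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$dir/k.key" \
  -out "$dir/c.crt" -subj "/CN=checkend-demo" -days 2 2>/dev/null
# Succeeds (exit 0) because the cert outlives the 24h window.
openssl x509 -noout -in "$dir/c.crt" -checkend 86400
```

A nonzero exit from any of these checks is what would push minikube down the cert-regeneration path instead of reusing the existing ones.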
	I1218 01:38:30.017688 1372607 kubeadm.go:401] StartCluster: {Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:30.017911 1372607 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 01:38:30.018018 1372607 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 01:38:30.079383 1372607 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:30.079453 1372607 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:30.079469 1372607 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:30.079487 1372607 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:30.079504 1372607 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:30.079537 1372607 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:30.079553 1372607 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:30.079572 1372607 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:30.079602 1372607 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:30.079628 1372607 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:30.079646 1372607 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:30.079664 1372607 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:30.079697 1372607 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:30.079714 1372607 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:30.079731 1372607 cri.go:89] found id: ""
	I1218 01:38:30.079812 1372607 ssh_runner.go:195] Run: sudo runc list -f json
	W1218 01:38:30.099483 1372607 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:30Z" level=error msg="open /run/runc: no such file or directory"
	I1218 01:38:30.099643 1372607 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 01:38:30.112772 1372607 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 01:38:30.112834 1372607 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 01:38:30.112927 1372607 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 01:38:30.125157 1372607 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 01:38:30.125986 1372607 kubeconfig.go:125] found "pause-022448" server: "https://192.168.85.2:8443"
	I1218 01:38:30.126977 1372607 kapi.go:59] client config for pause-022448: &rest.Config{Host:"https://192.168.85.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 01:38:30.127855 1372607 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 01:38:30.127993 1372607 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 01:38:30.128023 1372607 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 01:38:30.128048 1372607 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 01:38:30.128081 1372607 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 01:38:30.128490 1372607 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 01:38:30.162975 1372607 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.85.2
	I1218 01:38:30.163056 1372607 kubeadm.go:602] duration metric: took 50.194643ms to restartPrimaryControlPlane
	I1218 01:38:30.163112 1372607 kubeadm.go:403] duration metric: took 145.405713ms to StartCluster
	I1218 01:38:30.163148 1372607 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:30.163239 1372607 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:38:30.164256 1372607 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:30.164556 1372607 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 01:38:30.165091 1372607 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 01:38:30.165390 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:30.169702 1372607 out.go:179] * Enabled addons: 
	I1218 01:38:30.169831 1372607 out.go:179] * Verifying Kubernetes components...
	I1218 01:38:30.172591 1372607 addons.go:530] duration metric: took 7.493704ms for enable addons: enabled=[]
	I1218 01:38:30.172769 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:30.450580 1372607 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 01:38:30.465846 1372607 node_ready.go:35] waiting up to 6m0s for node "pause-022448" to be "Ready" ...
	I1218 01:38:34.774950 1372607 node_ready.go:49] node "pause-022448" is "Ready"
	I1218 01:38:34.774976 1372607 node_ready.go:38] duration metric: took 4.309029376s for node "pause-022448" to be "Ready" ...
	I1218 01:38:34.774989 1372607 api_server.go:52] waiting for apiserver process to appear ...
	I1218 01:38:34.775044 1372607 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:38:34.792761 1372607 api_server.go:72] duration metric: took 4.628151076s to wait for apiserver process to appear ...
	I1218 01:38:34.792783 1372607 api_server.go:88] waiting for apiserver healthz status ...
	I1218 01:38:34.792802 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:34.823449 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[-]poststarthook/start-apiextensions-controllers failed: reason withheld
	[-]poststarthook/crd-informer-synced failed: reason withheld
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[-]poststarthook/start-service-ip-repair-controllers failed: reason withheld
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[-]poststarthook/priority-and-fairness-config-producer failed: reason withheld
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[-]poststarthook/start-kubernetes-service-cidr-controller failed: reason withheld
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[-]poststarthook/apiservice-registration-controller failed: reason withheld
	[-]poststarthook/apiservice-discovery-controller failed: reason withheld
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1218 01:38:34.823530 1372607 api_server.go:103] status: https://192.168.85.2:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[-]poststarthook/start-apiextensions-controllers failed: reason withheld
	[-]poststarthook/crd-informer-synced failed: reason withheld
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[-]poststarthook/start-service-ip-repair-controllers failed: reason withheld
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[-]poststarthook/priority-and-fairness-config-producer failed: reason withheld
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[-]poststarthook/start-kubernetes-service-cidr-controller failed: reason withheld
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[-]poststarthook/apiservice-registration-controller failed: reason withheld
	[-]poststarthook/apiservice-discovery-controller failed: reason withheld
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1218 01:38:35.293183 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:35.301869 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1218 01:38:35.301898 1372607 api_server.go:103] status: https://192.168.85.2:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1218 01:38:35.793366 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:35.801387 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 200:
	ok
	I1218 01:38:35.803455 1372607 api_server.go:141] control plane version: v1.34.3
	I1218 01:38:35.803515 1372607 api_server.go:131] duration metric: took 1.010724282s to wait for apiserver health ...
	I1218 01:38:35.803539 1372607 system_pods.go:43] waiting for kube-system pods to appear ...
	I1218 01:38:35.809938 1372607 system_pods.go:59] 7 kube-system pods found
	I1218 01:38:35.810022 1372607 system_pods.go:61] "coredns-66bc5c9577-dlr4v" [c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 01:38:35.810046 1372607 system_pods.go:61] "etcd-pause-022448" [c4afcbf9-5fc8-4e08-a983-02f03fd32c82] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1218 01:38:35.810067 1372607 system_pods.go:61] "kindnet-dgxx6" [ba231964-7cab-44f1-9f75-137449bd092a] Running
	I1218 01:38:35.810091 1372607 system_pods.go:61] "kube-apiserver-pause-022448" [c7b45649-bbb0-44a5-9adc-6fc3f68c528c] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1218 01:38:35.810113 1372607 system_pods.go:61] "kube-controller-manager-pause-022448" [1006641e-db44-4894-a967-88876508da33] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1218 01:38:35.810133 1372607 system_pods.go:61] "kube-proxy-5cwsd" [3e34e4da-78fd-4bff-907a-b2ae772e0726] Running
	I1218 01:38:35.810154 1372607 system_pods.go:61] "kube-scheduler-pause-022448" [d5553265-5da5-49ec-92bb-a0f9fd9747b1] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1218 01:38:35.810175 1372607 system_pods.go:74] duration metric: took 6.617126ms to wait for pod list to return data ...
	I1218 01:38:35.810197 1372607 default_sa.go:34] waiting for default service account to be created ...
	I1218 01:38:35.811941 1372607 default_sa.go:45] found service account: "default"
	I1218 01:38:35.811995 1372607 default_sa.go:55] duration metric: took 1.779772ms for default service account to be created ...
	I1218 01:38:35.812019 1372607 system_pods.go:116] waiting for k8s-apps to be running ...
	I1218 01:38:35.815337 1372607 system_pods.go:86] 7 kube-system pods found
	I1218 01:38:35.815408 1372607 system_pods.go:89] "coredns-66bc5c9577-dlr4v" [c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 01:38:35.815442 1372607 system_pods.go:89] "etcd-pause-022448" [c4afcbf9-5fc8-4e08-a983-02f03fd32c82] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1218 01:38:35.815474 1372607 system_pods.go:89] "kindnet-dgxx6" [ba231964-7cab-44f1-9f75-137449bd092a] Running
	I1218 01:38:35.815502 1372607 system_pods.go:89] "kube-apiserver-pause-022448" [c7b45649-bbb0-44a5-9adc-6fc3f68c528c] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1218 01:38:35.815524 1372607 system_pods.go:89] "kube-controller-manager-pause-022448" [1006641e-db44-4894-a967-88876508da33] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1218 01:38:35.815555 1372607 system_pods.go:89] "kube-proxy-5cwsd" [3e34e4da-78fd-4bff-907a-b2ae772e0726] Running
	I1218 01:38:35.815582 1372607 system_pods.go:89] "kube-scheduler-pause-022448" [d5553265-5da5-49ec-92bb-a0f9fd9747b1] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1218 01:38:35.815604 1372607 system_pods.go:126] duration metric: took 3.565401ms to wait for k8s-apps to be running ...
	I1218 01:38:35.815636 1372607 system_svc.go:44] waiting for kubelet service to be running ....
	I1218 01:38:35.815724 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:35.838788 1372607 system_svc.go:56] duration metric: took 23.144353ms WaitForService to wait for kubelet
	I1218 01:38:35.838870 1372607 kubeadm.go:587] duration metric: took 5.674262921s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 01:38:35.838913 1372607 node_conditions.go:102] verifying NodePressure condition ...
	I1218 01:38:35.850393 1372607 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1218 01:38:35.850425 1372607 node_conditions.go:123] node cpu capacity is 2
	I1218 01:38:35.850437 1372607 node_conditions.go:105] duration metric: took 11.50504ms to run NodePressure ...
	I1218 01:38:35.850459 1372607 start.go:242] waiting for startup goroutines ...
	I1218 01:38:35.850467 1372607 start.go:247] waiting for cluster config update ...
	I1218 01:38:35.850475 1372607 start.go:256] writing updated cluster config ...
	I1218 01:38:35.850774 1372607 ssh_runner.go:195] Run: rm -f paused
	I1218 01:38:35.855768 1372607 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 01:38:35.856513 1372607 kapi.go:59] client config for pause-022448: &rest.Config{Host:"https://192.168.85.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:
[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 01:38:35.859695 1372607 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-dlr4v" in "kube-system" namespace to be "Ready" or be gone ...
	W1218 01:38:37.865094 1372607 pod_ready.go:104] pod "coredns-66bc5c9577-dlr4v" is not "Ready", error: <nil>
	W1218 01:38:39.865432 1372607 pod_ready.go:104] pod "coredns-66bc5c9577-dlr4v" is not "Ready", error: <nil>
	I1218 01:38:41.864793 1372607 pod_ready.go:94] pod "coredns-66bc5c9577-dlr4v" is "Ready"
	I1218 01:38:41.864822 1372607 pod_ready.go:86] duration metric: took 6.005102516s for pod "coredns-66bc5c9577-dlr4v" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:41.867162 1372607 pod_ready.go:83] waiting for pod "etcd-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:42.372657 1372607 pod_ready.go:94] pod "etcd-pause-022448" is "Ready"
	I1218 01:38:42.372685 1372607 pod_ready.go:86] duration metric: took 505.496847ms for pod "etcd-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:42.375317 1372607 pod_ready.go:83] waiting for pod "kube-apiserver-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	W1218 01:38:44.381276 1372607 pod_ready.go:104] pod "kube-apiserver-pause-022448" is not "Ready", error: <nil>
	W1218 01:38:46.880868 1372607 pod_ready.go:104] pod "kube-apiserver-pause-022448" is not "Ready", error: <nil>
	I1218 01:38:47.880871 1372607 pod_ready.go:94] pod "kube-apiserver-pause-022448" is "Ready"
	I1218 01:38:47.880896 1372607 pod_ready.go:86] duration metric: took 5.505553093s for pod "kube-apiserver-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.883236 1372607 pod_ready.go:83] waiting for pod "kube-controller-manager-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.888151 1372607 pod_ready.go:94] pod "kube-controller-manager-pause-022448" is "Ready"
	I1218 01:38:47.888179 1372607 pod_ready.go:86] duration metric: took 4.915285ms for pod "kube-controller-manager-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.890525 1372607 pod_ready.go:83] waiting for pod "kube-proxy-5cwsd" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.895132 1372607 pod_ready.go:94] pod "kube-proxy-5cwsd" is "Ready"
	I1218 01:38:47.895158 1372607 pod_ready.go:86] duration metric: took 4.608948ms for pod "kube-proxy-5cwsd" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.897552 1372607 pod_ready.go:83] waiting for pod "kube-scheduler-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:48.079772 1372607 pod_ready.go:94] pod "kube-scheduler-pause-022448" is "Ready"
	I1218 01:38:48.079806 1372607 pod_ready.go:86] duration metric: took 182.18767ms for pod "kube-scheduler-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:48.079818 1372607 pod_ready.go:40] duration metric: took 12.224019823s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 01:38:48.134352 1372607 start.go:625] kubectl: 1.33.2, cluster: 1.34.3 (minor skew: 1)
	I1218 01:38:48.138956 1372607 out.go:179] * Done! kubectl is now configured to use "pause-022448" cluster and "default" namespace by default
	
	
	==> CRI-O <==
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.650285846Z" level=info msg="Started container" PID=2328 containerID=c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02 description=kube-system/kube-proxy-5cwsd/kube-proxy id=455d2865-bc3f-4508-8cab-dd78d768beb6 name=/runtime.v1.RuntimeService/StartContainer sandboxID=000b371f75dec4868286d3a04b8ec41966c40f32a8cd0123350ef1b3407753a1
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.685034241Z" level=info msg="Created container 350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570: kube-system/kube-scheduler-pause-022448/kube-scheduler" id=65361021-b6bd-4665-a7f9-7bf088ec46f9 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.686352454Z" level=info msg="Starting container: 350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570" id=9684a66e-03d9-411c-8466-26275b2b05e3 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.69242197Z" level=info msg="Created container e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3: kube-system/kindnet-dgxx6/kindnet-cni" id=92766cea-f357-4985-8162-73bfdf4aac35 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.696483783Z" level=info msg="Starting container: e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3" id=1cf4cb4d-ab3d-4df3-aacf-698fad58d397 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.704138541Z" level=info msg="Started container" PID=2374 containerID=350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570 description=kube-system/kube-scheduler-pause-022448/kube-scheduler id=9684a66e-03d9-411c-8466-26275b2b05e3 name=/runtime.v1.RuntimeService/StartContainer sandboxID=208cbf5a8b1beff70580cec37ac914fc2acca76de0ab701057fcec0b96433d14
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.714577545Z" level=info msg="Started container" PID=2357 containerID=e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3 description=kube-system/kindnet-dgxx6/kindnet-cni id=1cf4cb4d-ab3d-4df3-aacf-698fad58d397 name=/runtime.v1.RuntimeService/StartContainer sandboxID=7e3192171b757ca842418cf43e843866c352c7b7e9769061a932689239b82c3f
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.770785887Z" level=info msg="Created container 8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73: kube-system/etcd-pause-022448/etcd" id=b2cd9fc2-4dc9-44d7-9b95-da3cc2697336 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.771462864Z" level=info msg="Starting container: 8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73" id=fc973e48-a765-4851-ba72-d8b15e5a3e27 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.773490095Z" level=info msg="Started container" PID=2388 containerID=8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73 description=kube-system/etcd-pause-022448/etcd id=fc973e48-a765-4851-ba72-d8b15e5a3e27 name=/runtime.v1.RuntimeService/StartContainer sandboxID=34e2e5f0c28e0469728c35fc292cb7ef48476fd2edb359e390c3102ab8f5fa4f
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.146035063Z" level=info msg="CNI monitoring event CREATE        \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15051887Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.150554988Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15057989Z" level=info msg="CNI monitoring event WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15393696Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.153975843Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.153999769Z" level=info msg="CNI monitoring event WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.157511485Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15754702Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.157569575Z" level=info msg="CNI monitoring event RENAME        \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161719279Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161759836Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161786297Z" level=info msg="CNI monitoring event CREATE        \"/etc/cni/net.d/10-kindnet.conflist\" ← \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.16517832Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.165216448Z" level=info msg="Updated default CNI network name to kindnet"
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                    NAMESPACE
	8a80f1b4ef9e2       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                     21 seconds ago      Running             etcd                      1                   34e2e5f0c28e0       etcd-pause-022448                      kube-system
	350ef333bdeea       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                     22 seconds ago      Running             kube-scheduler            1                   208cbf5a8b1be       kube-scheduler-pause-022448            kube-system
	a94767c237847       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                     22 seconds ago      Running             kube-apiserver            1                   91ec02eee6371       kube-apiserver-pause-022448            kube-system
	e0a20554f5b0e       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13                                     22 seconds ago      Running             kindnet-cni               1                   7e3192171b757       kindnet-dgxx6                          kube-system
	0162b54e23cb5       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                     22 seconds ago      Running             coredns                   1                   e8347e2183b9b       coredns-66bc5c9577-dlr4v               kube-system
	c0914c54d5ff2       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                     22 seconds ago      Running             kube-proxy                1                   000b371f75dec       kube-proxy-5cwsd                       kube-system
	9b6ce068b25af       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                     22 seconds ago      Running             kube-controller-manager   1                   1ab511191087d       kube-controller-manager-pause-022448   kube-system
	1ac523005a68f       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                     34 seconds ago      Exited              coredns                   0                   e8347e2183b9b       coredns-66bc5c9577-dlr4v               kube-system
	ad0758ba82bfa       docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3   45 seconds ago      Exited              kindnet-cni               0                   7e3192171b757       kindnet-dgxx6                          kube-system
	0cefb58714e9a       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                     46 seconds ago      Exited              kube-proxy                0                   000b371f75dec       kube-proxy-5cwsd                       kube-system
	c9db25fe09d87       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                     58 seconds ago      Exited              kube-apiserver            0                   91ec02eee6371       kube-apiserver-pause-022448            kube-system
	a1287f0a1dfc7       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                     58 seconds ago      Exited              kube-controller-manager   0                   1ab511191087d       kube-controller-manager-pause-022448   kube-system
	167bdfb1e99e7       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                     58 seconds ago      Exited              etcd                      0                   34e2e5f0c28e0       etcd-pause-022448                      kube-system
	db6702726086b       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                     58 seconds ago      Exited              kube-scheduler            0                   208cbf5a8b1be       kube-scheduler-pause-022448            kube-system
	
	
	==> coredns [0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = fa9a0cdcdddcb4be74a0eaf7cfcb211c40e29ddf5507e03bbfc0065bade31f0f2641a2513136e246f32328dd126fc93236fb5c595246f0763926a524386705e8
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:55089 - 22251 "HINFO IN 3841722693218782860.4026371041655979999. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.026485187s
	
	
	==> coredns [1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = fa9a0cdcdddcb4be74a0eaf7cfcb211c40e29ddf5507e03bbfc0065bade31f0f2641a2513136e246f32328dd126fc93236fb5c595246f0763926a524386705e8
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:45272 - 33276 "HINFO IN 3661192616598581439.7880941379114653225. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.011616725s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               pause-022448
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=pause-022448
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924
	                    minikube.k8s.io/name=pause-022448
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_12_18T01_38_00_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 18 Dec 2025 01:37:56 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-022448
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 18 Dec 2025 01:38:45 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:38:16 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.85.2
	  Hostname:    pause-022448
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	System Info:
	  Machine ID:                 02ff784b806e34735a6e229a69428228
	  System UUID:                a02f202d-16ca-4496-9551-42799335e19c
	  Boot ID:                    57207cc2-434a-4297-a7b8-47b6fa2e7487
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  cri-o://1.34.3
	  Kubelet Version:            v1.34.3
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                    CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                    ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-66bc5c9577-dlr4v                100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     47s
	  kube-system                 etcd-pause-022448                       100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         52s
	  kube-system                 kindnet-dgxx6                           100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      47s
	  kube-system                 kube-apiserver-pause-022448             250m (12%)    0 (0%)      0 (0%)           0 (0%)         52s
	  kube-system                 kube-controller-manager-pause-022448    200m (10%)    0 (0%)      0 (0%)           0 (0%)         52s
	  kube-system                 kube-proxy-5cwsd                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         47s
	  kube-system                 kube-scheduler-pause-022448             100m (5%)     0 (0%)      0 (0%)           0 (0%)         52s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age   From             Message
	  ----     ------                   ----  ----             -------
	  Normal   Starting                 46s   kube-proxy       
	  Normal   Starting                 16s   kube-proxy       
	  Normal   Starting                 52s   kubelet          Starting kubelet.
	  Warning  CgroupV1                 52s   kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  52s   kubelet          Node pause-022448 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    52s   kubelet          Node pause-022448 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     52s   kubelet          Node pause-022448 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           48s   node-controller  Node pause-022448 event: Registered Node pause-022448 in Controller
	  Normal   NodeReady                35s   kubelet          Node pause-022448 status is now: NodeReady
	  Normal   RegisteredNode           14s   node-controller  Node pause-022448 event: Registered Node pause-022448 in Controller
	
	
	==> dmesg <==
	[  +3.441233] overlayfs: idmapped layers are currently not supported
	[ +34.067705] overlayfs: idmapped layers are currently not supported
	[Dec18 01:06] overlayfs: idmapped layers are currently not supported
	[Dec18 01:07] overlayfs: idmapped layers are currently not supported
	[  +2.867801] overlayfs: idmapped layers are currently not supported
	[Dec18 01:08] overlayfs: idmapped layers are currently not supported
	[Dec18 01:09] overlayfs: idmapped layers are currently not supported
	[Dec18 01:10] overlayfs: idmapped layers are currently not supported
	[Dec18 01:14] overlayfs: idmapped layers are currently not supported
	[Dec18 01:15] overlayfs: idmapped layers are currently not supported
	[Dec18 01:16] overlayfs: idmapped layers are currently not supported
	[ +41.843420] overlayfs: idmapped layers are currently not supported
	[Dec18 01:17] overlayfs: idmapped layers are currently not supported
	[Dec18 01:18] overlayfs: idmapped layers are currently not supported
	[Dec18 01:19] overlayfs: idmapped layers are currently not supported
	[  +7.804932] overlayfs: idmapped layers are currently not supported
	[Dec18 01:20] overlayfs: idmapped layers are currently not supported
	[ +26.176950] overlayfs: idmapped layers are currently not supported
	[Dec18 01:21] overlayfs: idmapped layers are currently not supported
	[ +26.122242] overlayfs: idmapped layers are currently not supported
	[Dec18 01:22] overlayfs: idmapped layers are currently not supported
	[Dec18 01:23] overlayfs: idmapped layers are currently not supported
	[Dec18 01:25] overlayfs: idmapped layers are currently not supported
	[Dec18 01:27] overlayfs: idmapped layers are currently not supported
	[Dec18 01:37] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99] <==
	{"level":"warn","ts":"2025-12-18T01:37:55.449346Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49916","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.464947Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49924","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.483624Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49944","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.512841Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49954","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.532836Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49972","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.548521Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49976","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.633154Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:50004","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-12-18T01:38:21.598370Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T01:38:21.598432Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"pause-022448","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.85.2:2380"],"advertise-client-urls":["https://192.168.85.2:2379"]}
	{"level":"error","ts":"2025-12-18T01:38:21.598534Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T01:38:21.875763Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T01:38:21.875866Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.875888Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"9f0758e1c58a86ed","current-leader-member-id":"9f0758e1c58a86ed"}
	{"level":"info","ts":"2025-12-18T01:38:21.875993Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"info","ts":"2025-12-18T01:38:21.876010Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876052Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876123Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T01:38:21.876159Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876258Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.85.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876277Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.85.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T01:38:21.876284Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.85.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.879047Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.85.2:2380"}
	{"level":"error","ts":"2025-12-18T01:38:21.879121Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.85.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.879158Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.85.2:2380"}
	{"level":"info","ts":"2025-12-18T01:38:21.879165Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"pause-022448","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.85.2:2380"],"advertise-client-urls":["https://192.168.85.2:2379"]}
	
	
	==> etcd [8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73] <==
	{"level":"warn","ts":"2025-12-18T01:38:33.434992Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60562","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.453542Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60574","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.477288Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60592","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.489475Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60614","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.532162Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60632","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.533303Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60646","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.549757Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60668","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.572818Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60676","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.590340Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60692","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.613977Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60710","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.631338Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60728","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.643929Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60748","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.674419Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60760","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.683109Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60766","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.698828Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60784","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.714424Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60810","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.729104Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60826","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.745525Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60858","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.763810Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60878","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.777515Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60894","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.793995Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60912","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.833347Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60924","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.860299Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60956","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.874000Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60980","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.929208Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:32780","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 01:38:51 up  8:21,  0 user,  load average: 2.60, 1.74, 1.86
	Linux pause-022448 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8] <==
	I1218 01:38:06.628124       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 01:38:06.628550       1 main.go:139] hostIP = 192.168.85.2
	podIP = 192.168.85.2
	I1218 01:38:06.628684       1 main.go:148] setting mtu 1500 for CNI 
	I1218 01:38:06.628696       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 01:38:06.628708       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T01:38:06Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1218 01:38:06.828981       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 01:38:06.829167       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 01:38:06.829206       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 01:38:06.832035       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1218 01:38:07.031989       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1218 01:38:07.032076       1 metrics.go:72] Registering metrics
	I1218 01:38:07.032155       1 controller.go:711] "Syncing nftables rules"
	I1218 01:38:16.832903       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:16.832958       1 main.go:301] handling current node
	
	
	==> kindnet [e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3] <==
	I1218 01:38:29.877429       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 01:38:29.877780       1 main.go:139] hostIP = 192.168.85.2
	podIP = 192.168.85.2
	I1218 01:38:29.892469       1 main.go:148] setting mtu 1500 for CNI 
	I1218 01:38:29.896246       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 01:38:29.896297       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T01:38:30Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1218 01:38:30.145929       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 01:38:30.146016       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 01:38:30.146051       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 01:38:30.150995       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1218 01:38:34.858241       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1218 01:38:34.858278       1 metrics.go:72] Registering metrics
	I1218 01:38:34.858349       1 controller.go:711] "Syncing nftables rules"
	I1218 01:38:40.145640       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:40.145705       1 main.go:301] handling current node
	I1218 01:38:50.145780       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:50.145816       1 main.go:301] handling current node
	
	
	==> kube-apiserver [a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747] <==
	I1218 01:38:34.853946       1 cache.go:39] Caches are synced for LocalAvailability controller
	I1218 01:38:34.854178       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I1218 01:38:34.854196       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I1218 01:38:34.854330       1 shared_informer.go:356] "Caches are synced" controller="configmaps"
	I1218 01:38:34.854551       1 handler_discovery.go:451] Starting ResourceDiscoveryManager
	I1218 01:38:34.854647       1 shared_informer.go:356] "Caches are synced" controller="node_authorizer"
	I1218 01:38:34.866953       1 shared_informer.go:356] "Caches are synced" controller="kubernetes-service-cidr-controller"
	I1218 01:38:34.867070       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1218 01:38:34.872554       1 shared_informer.go:356] "Caches are synced" controller="*generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]"
	I1218 01:38:34.872598       1 policy_source.go:240] refreshing policies
	I1218 01:38:34.872642       1 shared_informer.go:356] "Caches are synced" controller="crd-autoregister"
	I1218 01:38:34.872683       1 shared_informer.go:356] "Caches are synced" controller="ipallocator-repair-controller"
	I1218 01:38:34.872802       1 aggregator.go:171] initial CRD sync complete...
	I1218 01:38:34.872817       1 autoregister_controller.go:144] Starting autoregister controller
	I1218 01:38:34.872823       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I1218 01:38:34.872829       1 cache.go:39] Caches are synced for autoregister controller
	E1218 01:38:34.876537       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I1218 01:38:34.882066       1 cidrallocator.go:301] created ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1218 01:38:34.894320       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1218 01:38:35.451790       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1218 01:38:35.786384       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	I1218 01:38:37.230921       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1218 01:38:37.278976       1 controller.go:667] quota admission added evaluator for: endpoints
	I1218 01:38:37.529216       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1218 01:38:37.581168       1 controller.go:667] quota admission added evaluator for: deployments.apps
	
	
	==> kube-apiserver [c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e] <==
	W1218 01:38:21.618173       1 logging.go:55] [core] [Channel #243 SubChannel #245]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618253       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617380       1 logging.go:55] [core] [Channel #27 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618369       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617507       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618474       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617637       1 logging.go:55] [core] [Channel #75 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618604       1 logging.go:55] [core] [Channel #147 SubChannel #149]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617958       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618689       1 logging.go:55] [core] [Channel #215 SubChannel #217]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618141       1 logging.go:55] [core] [Channel #195 SubChannel #197]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618266       1 logging.go:55] [core] [Channel #67 SubChannel #69]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.616649       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618337       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618433       1 logging.go:55] [core] [Channel #119 SubChannel #121]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618532       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619078       1 logging.go:55] [core] [Channel #251 SubChannel #253]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619217       1 logging.go:55] [core] [Channel #59 SubChannel #61]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619296       1 logging.go:55] [core] [Channel #115 SubChannel #117]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619368       1 logging.go:55] [core] [Channel #123 SubChannel #125]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619624       1 logging.go:55] [core] [Channel #35 SubChannel #37]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619724       1 logging.go:55] [core] [Channel #87 SubChannel #89]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619780       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619860       1 logging.go:55] [core] [Channel #11 SubChannel #14]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.620165       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60] <==
	I1218 01:38:37.173522       1 shared_informer.go:356] "Caches are synced" controller="ReplicationController"
	I1218 01:38:37.175993       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1218 01:38:37.177351       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1218 01:38:37.179025       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 01:38:37.181244       1 shared_informer.go:356] "Caches are synced" controller="crt configmap"
	I1218 01:38:37.182438       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 01:38:37.185584       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 01:38:37.189775       1 shared_informer.go:356] "Caches are synced" controller="node"
	I1218 01:38:37.189826       1 range_allocator.go:177] "Sending events to api server" logger="node-ipam-controller"
	I1218 01:38:37.189849       1 range_allocator.go:183] "Starting range CIDR allocator" logger="node-ipam-controller"
	I1218 01:38:37.189854       1 shared_informer.go:349] "Waiting for caches to sync" controller="cidrallocator"
	I1218 01:38:37.189859       1 shared_informer.go:356] "Caches are synced" controller="cidrallocator"
	I1218 01:38:37.193332       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1218 01:38:37.195677       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1218 01:38:37.198909       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 01:38:37.201152       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1218 01:38:37.203389       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:37.207522       1 shared_informer.go:356] "Caches are synced" controller="namespace"
	I1218 01:38:37.216881       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:37.216912       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1218 01:38:37.216922       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1218 01:38:37.220292       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1218 01:38:37.222604       1 shared_informer.go:356] "Caches are synced" controller="stateful set"
	I1218 01:38:37.222661       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1218 01:38:37.223074       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	
	
	==> kube-controller-manager [a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a] <==
	I1218 01:38:03.441618       1 shared_informer.go:349] "Waiting for caches to sync" controller="cidrallocator"
	I1218 01:38:03.441662       1 shared_informer.go:356] "Caches are synced" controller="cidrallocator"
	I1218 01:38:03.442084       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1218 01:38:03.451774       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1218 01:38:03.456331       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1218 01:38:03.457418       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:03.457950       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="pause-022448" podCIDRs=["10.244.0.0/24"]
	I1218 01:38:03.464302       1 shared_informer.go:356] "Caches are synced" controller="service-cidr-controller"
	I1218 01:38:03.470810       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1218 01:38:03.478863       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 01:38:03.478947       1 shared_informer.go:356] "Caches are synced" controller="stateful set"
	I1218 01:38:03.479003       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1218 01:38:03.479063       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1218 01:38:03.479084       1 shared_informer.go:356] "Caches are synced" controller="daemon sets"
	I1218 01:38:03.479117       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1218 01:38:03.479144       1 shared_informer.go:356] "Caches are synced" controller="deployment"
	I1218 01:38:03.479159       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 01:38:03.481493       1 shared_informer.go:356] "Caches are synced" controller="expand"
	I1218 01:38:03.481735       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1218 01:38:03.481745       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1218 01:38:03.481755       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1218 01:38:03.481764       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1218 01:38:03.488044       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1218 01:38:03.499113       1 shared_informer.go:356] "Caches are synced" controller="cronjob"
	I1218 01:38:18.431653       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-proxy [0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a] <==
	I1218 01:38:04.854025       1 server_linux.go:53] "Using iptables proxy"
	I1218 01:38:04.937684       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 01:38:05.037853       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 01:38:05.037960       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.85.2"]
	E1218 01:38:05.038167       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 01:38:05.059416       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 01:38:05.059536       1 server_linux.go:132] "Using iptables Proxier"
	I1218 01:38:05.063681       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 01:38:05.063981       1 server.go:527] "Version info" version="v1.34.3"
	I1218 01:38:05.064004       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:05.065523       1 config.go:200] "Starting service config controller"
	I1218 01:38:05.065547       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 01:38:05.065568       1 config.go:106] "Starting endpoint slice config controller"
	I1218 01:38:05.065572       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 01:38:05.065596       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 01:38:05.065691       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 01:38:05.066305       1 config.go:309] "Starting node config controller"
	I1218 01:38:05.066323       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 01:38:05.066329       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 01:38:05.166335       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 01:38:05.166401       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 01:38:05.166545       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-proxy [c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02] <==
	I1218 01:38:33.315668       1 server_linux.go:53] "Using iptables proxy"
	I1218 01:38:33.794499       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 01:38:34.895701       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 01:38:34.895733       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.85.2"]
	E1218 01:38:34.895832       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 01:38:34.934437       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 01:38:34.935131       1 server_linux.go:132] "Using iptables Proxier"
	I1218 01:38:34.948973       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 01:38:34.949303       1 server.go:527] "Version info" version="v1.34.3"
	I1218 01:38:34.949482       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:34.951700       1 config.go:200] "Starting service config controller"
	I1218 01:38:34.951764       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 01:38:34.951807       1 config.go:106] "Starting endpoint slice config controller"
	I1218 01:38:34.951835       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 01:38:34.951871       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 01:38:34.951896       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 01:38:34.952678       1 config.go:309] "Starting node config controller"
	I1218 01:38:34.952749       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 01:38:34.952780       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 01:38:35.052744       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 01:38:35.053526       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 01:38:35.053552       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570] <==
	I1218 01:38:32.710955       1 serving.go:386] Generated self-signed cert in-memory
	W1218 01:38:34.770003       1 requestheader_controller.go:204] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W1218 01:38:34.770119       1 authentication.go:397] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W1218 01:38:34.770156       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 01:38:34.770198       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 01:38:34.838850       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 01:38:34.838957       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:34.841033       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:34.841118       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:34.841576       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	I1218 01:38:34.841680       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 01:38:34.942329       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kube-scheduler [db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed] <==
	E1218 01:37:56.545224       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 01:37:56.545269       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 01:37:56.545308       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 01:37:56.545351       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 01:37:56.545413       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 01:37:56.545460       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 01:37:56.545504       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 01:37:56.545546       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 01:37:56.545636       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 01:37:56.545687       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 01:37:56.545722       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 01:37:56.544962       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 01:37:57.367785       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 01:37:57.378572       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 01:37:57.504599       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 01:37:57.525795       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 01:37:57.539666       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 01:37:57.905852       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1218 01:37:59.794816       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:21.594489       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 01:38:21.594512       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 01:38:21.594532       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 01:38:21.594560       1 configmap_cafile_content.go:226] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:21.594713       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 01:38:21.594733       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497421    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497646    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497874    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-dlr4v\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497952    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: I1218 01:38:29.497974    1316 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.498190    1316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused" interval="200ms"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: I1218 01:38:29.509269    1316 scope.go:117] "RemoveContainer" containerID="167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.509770    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-proxy-5cwsd\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="3e34e4da-78fd-4bff-907a-b2ae772e0726" pod="kube-system/kube-proxy-5cwsd"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.509929    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-dlr4v\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510075    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="ea3387825836e7fd100b14a1e723ead4" pod="kube-system/kube-controller-manager-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510218    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="9399ade1e13df4d0bf1cb0e0daf33ecc" pod="kube-system/kube-scheduler-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510360    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/etcd-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="4bd8d366a1589788e68113f96a17c367" pod="kube-system/etcd-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510527    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="3bee34e012a9123c51f981d1d256b412" pod="kube-system/kube-apiserver-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510682    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kindnet-dgxx6\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="ba231964-7cab-44f1-9f75-137449bd092a" pod="kube-system/kindnet-dgxx6"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.698894    1316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused" interval="400ms"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.693944    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"kube-proxy-5cwsd\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="3e34e4da-78fd-4bff-907a-b2ae772e0726" pod="kube-system/kube-proxy-5cwsd"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694628    1316 reflector.go:205] "Failed to watch" err="configmaps \"coredns\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"coredns\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694683    1316 reflector.go:205] "Failed to watch" err="configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694708    1316 reflector.go:205] "Failed to watch" err="configmaps \"kube-proxy\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"kube-proxy\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.765576    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"coredns-66bc5c9577-dlr4v\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.793956    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"kube-controller-manager-pause-022448\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="ea3387825836e7fd100b14a1e723ead4" pod="kube-system/kube-controller-manager-pause-022448"
	Dec 18 01:38:39 pause-022448 kubelet[1316]: W1218 01:38:39.369468    1316 conversion.go:112] Could not get instant cpu stats: cumulative stats decrease
	Dec 18 01:38:48 pause-022448 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
	Dec 18 01:38:48 pause-022448 systemd[1]: kubelet.service: Deactivated successfully.
	Dec 18 01:38:48 pause-022448 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p pause-022448 -n pause-022448
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p pause-022448 -n pause-022448: exit status 2 (367.653996ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:270: (dbg) Run:  kubectl --context pause-022448 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:294: <<< TestPause/serial/Pause FAILED: end of post-mortem logs <<<
helpers_test.go:295: ---------------------/post-mortem---------------------------------
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestPause/serial/Pause]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestPause/serial/Pause]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect pause-022448
helpers_test.go:244: (dbg) docker inspect pause-022448:

-- stdout --
	[
	    {
	        "Id": "bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443",
	        "Created": "2025-12-18T01:37:33.636968388Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1370224,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-18T01:37:33.696677184Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/hostname",
	        "HostsPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/hosts",
	        "LogPath": "/var/lib/docker/containers/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443/bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443-json.log",
	        "Name": "/pause-022448",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "pause-022448:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "pause-022448",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "bc3df9cf646ce7528eb3e7e0bfa6a46d03b7522b75464b9a965a6fe219583443",
	                "LowerDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc-init/diff:/var/lib/docker/overlay2/7b805f61ea9056099e29eaf620faabe57a79e0038b5dac8d955ed702c0e90167/diff",
	                "MergedDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/merged",
	                "UpperDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/diff",
	                "WorkDir": "/var/lib/docker/overlay2/9f47356294693338f385017b6f8007c03699af175ba449ba5aa05658189af9bc/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "pause-022448",
	                "Source": "/var/lib/docker/volumes/pause-022448/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "pause-022448",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "pause-022448",
	                "name.minikube.sigs.k8s.io": "pause-022448",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "43f2c939981ad4e85c64ef4a2af00c4e228079cff99bb209764229851f7874a0",
	            "SandboxKey": "/var/run/docker/netns/43f2c939981a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34170"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34171"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34174"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34172"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34173"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "pause-022448": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.85.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "ba:12:aa:5f:cd:c8",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "832e1b8b28b9ef274f679f0160b0827750e9cd6343166763adaf998d26deedb6",
	                    "EndpointID": "fbd81b37a63ec55e79c692c4f04d24ec7c650d0dd040afca26e0684cacd1571a",
	                    "Gateway": "192.168.85.1",
	                    "IPAddress": "192.168.85.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "pause-022448",
	                        "bc3df9cf646c"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p pause-022448 -n pause-022448
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p pause-022448 -n pause-022448: exit status 2 (343.971722ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestPause/serial/Pause FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestPause/serial/Pause]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p pause-022448 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p pause-022448 logs -n 25: (1.367332029s)
helpers_test.go:261: TestPause/serial/Pause logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                     ARGS                                                                      │          PROFILE          │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -p NoKubernetes-573547 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:25 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p missing-upgrade-381437 --memory=3072 --driver=docker  --container-runtime=crio                                                             │ missing-upgrade-381437    │ jenkins │ v1.35.0 │ 18 Dec 25 01:25 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ delete  │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio                         │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ ssh     │ -p NoKubernetes-573547 sudo systemctl is-active --quiet service kubelet                                                                       │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │                     │
	│ start   │ -p missing-upgrade-381437 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ missing-upgrade-381437    │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:27 UTC │
	│ stop    │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p NoKubernetes-573547 --driver=docker  --container-runtime=crio                                                                              │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ ssh     │ -p NoKubernetes-573547 sudo systemctl is-active --quiet service kubelet                                                                       │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │                     │
	│ delete  │ -p NoKubernetes-573547                                                                                                                        │ NoKubernetes-573547       │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:26 UTC │
	│ start   │ -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio      │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:26 UTC │ 18 Dec 25 01:27 UTC │
	│ delete  │ -p missing-upgrade-381437                                                                                                                     │ missing-upgrade-381437    │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p stopped-upgrade-156815 --memory=3072 --vm-driver=docker  --container-runtime=crio                                                          │ stopped-upgrade-156815    │ jenkins │ v1.35.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ stop    │ -p kubernetes-upgrade-823559                                                                                                                  │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p kubernetes-upgrade-823559 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio │ kubernetes-upgrade-823559 │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │                     │
	│ stop    │ stopped-upgrade-156815 stop                                                                                                                   │ stopped-upgrade-156815    │ jenkins │ v1.35.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:27 UTC │
	│ start   │ -p stopped-upgrade-156815 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ stopped-upgrade-156815    │ jenkins │ v1.37.0 │ 18 Dec 25 01:27 UTC │ 18 Dec 25 01:32 UTC │
	│ delete  │ -p stopped-upgrade-156815                                                                                                                     │ stopped-upgrade-156815    │ jenkins │ v1.37.0 │ 18 Dec 25 01:32 UTC │ 18 Dec 25 01:32 UTC │
	│ start   │ -p running-upgrade-850997 --memory=3072 --vm-driver=docker  --container-runtime=crio                                                          │ running-upgrade-850997    │ jenkins │ v1.35.0 │ 18 Dec 25 01:32 UTC │ 18 Dec 25 01:33 UTC │
	│ start   │ -p running-upgrade-850997 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                      │ running-upgrade-850997    │ jenkins │ v1.37.0 │ 18 Dec 25 01:33 UTC │ 18 Dec 25 01:37 UTC │
	│ delete  │ -p running-upgrade-850997                                                                                                                     │ running-upgrade-850997    │ jenkins │ v1.37.0 │ 18 Dec 25 01:37 UTC │ 18 Dec 25 01:37 UTC │
	│ start   │ -p pause-022448 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=crio                                     │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:37 UTC │ 18 Dec 25 01:38 UTC │
	│ start   │ -p pause-022448 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio                                                              │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:38 UTC │ 18 Dec 25 01:38 UTC │
	│ pause   │ -p pause-022448 --alsologtostderr -v=5                                                                                                        │ pause-022448              │ jenkins │ v1.37.0 │ 18 Dec 25 01:38 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 01:38:20
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 01:38:20.178210 1372607 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:38:20.178403 1372607 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:20.178434 1372607 out.go:374] Setting ErrFile to fd 2...
	I1218 01:38:20.178457 1372607 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:38:20.178920 1372607 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:38:20.179442 1372607 out.go:368] Setting JSON to false
	I1218 01:38:20.180617 1372607 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":30049,"bootTime":1765991852,"procs":198,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 01:38:20.180755 1372607 start.go:143] virtualization:  
	I1218 01:38:20.183852 1372607 out.go:179] * [pause-022448] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 01:38:20.186054 1372607 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 01:38:20.186120 1372607 notify.go:221] Checking for updates...
	I1218 01:38:20.191884 1372607 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 01:38:20.194771 1372607 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:38:20.197710 1372607 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 01:38:20.200688 1372607 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 01:38:20.203636 1372607 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 01:38:20.207011 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:20.207574 1372607 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 01:38:20.237905 1372607 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 01:38:20.238026 1372607 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:38:20.294866 1372607 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:2 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:51 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:38:20.284860013 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:38:20.294972 1372607 docker.go:319] overlay module found
	I1218 01:38:20.298058 1372607 out.go:179] * Using the docker driver based on existing profile
	I1218 01:38:20.300956 1372607 start.go:309] selected driver: docker
	I1218 01:38:20.300981 1372607 start.go:927] validating driver "docker" against &{Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:20.301115 1372607 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 01:38:20.301236 1372607 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:38:20.359258 1372607 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:2 ContainersRunning:2 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:51 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:38:20.350476276 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:38:20.359647 1372607 cni.go:84] Creating CNI manager for ""
	I1218 01:38:20.359701 1372607 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:38:20.359753 1372607 start.go:353] cluster config:
	{Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:20.362991 1372607 out.go:179] * Starting "pause-022448" primary control-plane node in "pause-022448" cluster
	I1218 01:38:20.365720 1372607 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 01:38:20.368622 1372607 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1218 01:38:20.371409 1372607 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:38:20.371454 1372607 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 01:38:20.371466 1372607 cache.go:65] Caching tarball of preloaded images
	I1218 01:38:20.371488 1372607 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 01:38:20.371554 1372607 preload.go:238] Found /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 in cache, skipping download
	I1218 01:38:20.371565 1372607 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on crio
	I1218 01:38:20.371695 1372607 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/config.json ...
	I1218 01:38:20.395485 1372607 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1218 01:38:20.395504 1372607 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1218 01:38:20.395517 1372607 cache.go:243] Successfully downloaded all kic artifacts
	I1218 01:38:20.395555 1372607 start.go:360] acquireMachinesLock for pause-022448: {Name:mked9a692720255ef16f316861ffd9a1e4f4fa5c Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1218 01:38:20.395610 1372607 start.go:364] duration metric: took 37.078µs to acquireMachinesLock for "pause-022448"
	I1218 01:38:20.395630 1372607 start.go:96] Skipping create...Using existing machine configuration
	I1218 01:38:20.395635 1372607 fix.go:54] fixHost starting: 
	I1218 01:38:20.395885 1372607 cli_runner.go:164] Run: docker container inspect pause-022448 --format={{.State.Status}}
	I1218 01:38:20.423500 1372607 fix.go:112] recreateIfNeeded on pause-022448: state=Running err=<nil>
	W1218 01:38:20.423528 1372607 fix.go:138] unexpected machine state, will restart: <nil>
	I1218 01:38:20.426614 1372607 out.go:252] * Updating the running docker "pause-022448" container ...
	I1218 01:38:20.426651 1372607 machine.go:94] provisionDockerMachine start ...
	I1218 01:38:20.426754 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.452585 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.452912 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.452921 1372607 main.go:143] libmachine: About to run SSH command:
	hostname
	I1218 01:38:20.607980 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-022448
	
	I1218 01:38:20.608019 1372607 ubuntu.go:182] provisioning hostname "pause-022448"
	I1218 01:38:20.608086 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.625625 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.625934 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.625951 1372607 main.go:143] libmachine: About to run SSH command:
	sudo hostname pause-022448 && echo "pause-022448" | sudo tee /etc/hostname
	I1218 01:38:20.790700 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-022448
	
	I1218 01:38:20.790777 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:20.810422 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:20.810742 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:20.810764 1372607 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-022448' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-022448/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-022448' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1218 01:38:20.968737 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1218 01:38:20.968765 1372607 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22186-1156339/.minikube CaCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22186-1156339/.minikube}
	I1218 01:38:20.968793 1372607 ubuntu.go:190] setting up certificates
	I1218 01:38:20.968802 1372607 provision.go:84] configureAuth start
	I1218 01:38:20.968875 1372607 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-022448
	I1218 01:38:20.986662 1372607 provision.go:143] copyHostCerts
	I1218 01:38:20.986743 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem, removing ...
	I1218 01:38:20.986756 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem
	I1218 01:38:20.986831 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/cert.pem (1123 bytes)
	I1218 01:38:20.986934 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem, removing ...
	I1218 01:38:20.986943 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem
	I1218 01:38:20.986970 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/key.pem (1679 bytes)
	I1218 01:38:20.987026 1372607 exec_runner.go:144] found /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem, removing ...
	I1218 01:38:20.987034 1372607 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem
	I1218 01:38:20.987057 1372607 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.pem (1078 bytes)
	I1218 01:38:20.987104 1372607 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem org=jenkins.pause-022448 san=[127.0.0.1 192.168.85.2 localhost minikube pause-022448]
	I1218 01:38:21.213810 1372607 provision.go:177] copyRemoteCerts
	I1218 01:38:21.213877 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1218 01:38:21.213915 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:21.231304 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:21.341140 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1218 01:38:21.359392 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1218 01:38:21.377862 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1218 01:38:21.397159 1372607 provision.go:87] duration metric: took 428.329775ms to configureAuth
	I1218 01:38:21.397185 1372607 ubuntu.go:206] setting minikube options for container-runtime
	I1218 01:38:21.397415 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:21.397533 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:21.422373 1372607 main.go:143] libmachine: Using SSH client type: native
	I1218 01:38:21.422766 1372607 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db9e0] 0x3ddee0 <nil>  [] 0s} 127.0.0.1 34170 <nil> <nil>}
	I1218 01:38:21.422788 1372607 main.go:143] libmachine: About to run SSH command:
	sudo mkdir -p /etc/sysconfig && printf %s "
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	" | sudo tee /etc/sysconfig/crio.minikube && sudo systemctl restart crio
	I1218 01:38:26.806476 1372607 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	CRIO_MINIKUBE_OPTIONS='--insecure-registry 10.96.0.0/12 '
	
	I1218 01:38:26.806497 1372607 machine.go:97] duration metric: took 6.379838428s to provisionDockerMachine
	I1218 01:38:26.806509 1372607 start.go:293] postStartSetup for "pause-022448" (driver="docker")
	I1218 01:38:26.806534 1372607 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1218 01:38:26.806601 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1218 01:38:26.806651 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:26.824156 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:26.933046 1372607 ssh_runner.go:195] Run: cat /etc/os-release
	I1218 01:38:26.936496 1372607 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1218 01:38:26.936522 1372607 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1218 01:38:26.936533 1372607 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/addons for local assets ...
	I1218 01:38:26.936586 1372607 filesync.go:126] Scanning /home/jenkins/minikube-integration/22186-1156339/.minikube/files for local assets ...
	I1218 01:38:26.936668 1372607 filesync.go:149] local asset: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem -> 11595522.pem in /etc/ssl/certs
	I1218 01:38:26.936783 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1218 01:38:26.944358 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:38:26.960952 1372607 start.go:296] duration metric: took 154.42761ms for postStartSetup
	I1218 01:38:26.961029 1372607 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:38:26.961071 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:26.977821 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.081422 1372607 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1218 01:38:27.086061 1372607 fix.go:56] duration metric: took 6.690419014s for fixHost
	I1218 01:38:27.086083 1372607 start.go:83] releasing machines lock for "pause-022448", held for 6.690463508s
	I1218 01:38:27.086153 1372607 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-022448
	I1218 01:38:27.102359 1372607 ssh_runner.go:195] Run: cat /version.json
	I1218 01:38:27.102414 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:27.102714 1372607 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1218 01:38:27.102772 1372607 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-022448
	I1218 01:38:27.122653 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.123202 1372607 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34170 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/pause-022448/id_rsa Username:docker}
	I1218 01:38:27.324693 1372607 ssh_runner.go:195] Run: systemctl --version
	I1218 01:38:27.330660 1372607 ssh_runner.go:195] Run: sudo sh -c "podman version >/dev/null"
	I1218 01:38:27.369263 1372607 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1218 01:38:27.373436 1372607 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1218 01:38:27.373541 1372607 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1218 01:38:27.381343 1372607 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1218 01:38:27.381368 1372607 start.go:496] detecting cgroup driver to use...
	I1218 01:38:27.381398 1372607 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1218 01:38:27.381444 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I1218 01:38:27.396156 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I1218 01:38:27.409071 1372607 docker.go:218] disabling cri-docker service (if available) ...
	I1218 01:38:27.409134 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1218 01:38:27.424734 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1218 01:38:27.437481 1372607 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1218 01:38:27.562532 1372607 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1218 01:38:27.699809 1372607 docker.go:234] disabling docker service ...
	I1218 01:38:27.699879 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1218 01:38:27.714713 1372607 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1218 01:38:27.727295 1372607 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1218 01:38:27.878545 1372607 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1218 01:38:28.044551 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1218 01:38:28.057546 1372607 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/crio/crio.sock
	" | sudo tee /etc/crictl.yaml"
	I1218 01:38:28.072139 1372607 crio.go:59] configure cri-o to use "registry.k8s.io/pause:3.10.1" pause image...
	I1218 01:38:28.072205 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*pause_image = .*$|pause_image = "registry.k8s.io/pause:3.10.1"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.081781 1372607 crio.go:70] configuring cri-o to use "cgroupfs" as cgroup driver...
	I1218 01:38:28.081898 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|^.*cgroup_manager = .*$|cgroup_manager = "cgroupfs"|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.091291 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/conmon_cgroup = .*/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.100508 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/cgroup_manager = .*/a conmon_cgroup = "pod"' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.109712 1372607 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1218 01:38:28.117680 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *"net.ipv4.ip_unprivileged_port_start=.*"/d' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.126856 1372607 ssh_runner.go:195] Run: sh -c "sudo grep -q "^ *default_sysctls" /etc/crio/crio.conf.d/02-crio.conf || sudo sed -i '/conmon_cgroup = .*/a default_sysctls = \[\n\]' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.135450 1372607 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^default_sysctls *= *\[|&\n  "net.ipv4.ip_unprivileged_port_start=0",|' /etc/crio/crio.conf.d/02-crio.conf"
	I1218 01:38:28.144368 1372607 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1218 01:38:28.151799 1372607 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1218 01:38:28.159436 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:28.297009 1372607 ssh_runner.go:195] Run: sudo systemctl restart crio
	I1218 01:38:28.543730 1372607 start.go:543] Will wait 60s for socket path /var/run/crio/crio.sock
	I1218 01:38:28.543849 1372607 ssh_runner.go:195] Run: stat /var/run/crio/crio.sock
	I1218 01:38:28.547461 1372607 start.go:564] Will wait 60s for crictl version
	I1218 01:38:28.547530 1372607 ssh_runner.go:195] Run: which crictl
	I1218 01:38:28.550761 1372607 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1218 01:38:28.578902 1372607 start.go:580] Version:  0.1.0
	RuntimeName:  cri-o
	RuntimeVersion:  1.34.3
	RuntimeApiVersion:  v1
	I1218 01:38:28.579066 1372607 ssh_runner.go:195] Run: crio --version
	I1218 01:38:28.609579 1372607 ssh_runner.go:195] Run: crio --version
	I1218 01:38:28.642456 1372607 out.go:179] * Preparing Kubernetes v1.34.3 on CRI-O 1.34.3 ...
	I1218 01:38:28.645402 1372607 cli_runner.go:164] Run: docker network inspect pause-022448 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1218 01:38:28.663331 1372607 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1218 01:38:28.667663 1372607 kubeadm.go:884] updating cluster {Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerName
s:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false regist
ry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1218 01:38:28.667812 1372607 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 01:38:28.667876 1372607 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:38:28.713646 1372607 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 01:38:28.713673 1372607 crio.go:433] Images already preloaded, skipping extraction
	I1218 01:38:28.713741 1372607 ssh_runner.go:195] Run: sudo crictl images --output json
	I1218 01:38:28.741356 1372607 crio.go:514] all images are preloaded for cri-o runtime.
	I1218 01:38:28.741376 1372607 cache_images.go:86] Images are preloaded, skipping loading
	I1218 01:38:28.741384 1372607 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.3 crio true true} ...
	I1218 01:38:28.741483 1372607 kubeadm.go:947] kubelet [Unit]
	Wants=crio.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --cgroups-per-qos=false --config=/var/lib/kubelet/config.yaml --enforce-node-allocatable= --hostname-override=pause-022448 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1218 01:38:28.741573 1372607 ssh_runner.go:195] Run: crio config
	I1218 01:38:28.813820 1372607 cni.go:84] Creating CNI manager for ""
	I1218 01:38:28.813843 1372607 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 01:38:28.813865 1372607 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1218 01:38:28.813887 1372607 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-022448 NodeName:pause-022448 DNSDomain:cluster.local CRISocket:/var/run/crio/crio.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernete
s/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/crio/crio.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1218 01:38:28.814019 1372607 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/crio/crio.sock
	  name: "pause-022448"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1218 01:38:28.814098 1372607 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1218 01:38:28.821849 1372607 binaries.go:51] Found k8s binaries, skipping transfer
	I1218 01:38:28.821967 1372607 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1218 01:38:28.829115 1372607 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (362 bytes)
	I1218 01:38:28.841124 1372607 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1218 01:38:28.852722 1372607 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2209 bytes)
	I1218 01:38:28.864281 1372607 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1218 01:38:28.867940 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:29.000376 1372607 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 01:38:29.014502 1372607 certs.go:69] Setting up /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448 for IP: 192.168.85.2
	I1218 01:38:29.014533 1372607 certs.go:195] generating shared ca certs ...
	I1218 01:38:29.014549 1372607 certs.go:227] acquiring lock for ca certs: {Name:mk9533cea3d0a0bf5565e9379af4d54f65bedc4d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:29.014722 1372607 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key
	I1218 01:38:29.014781 1372607 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key
	I1218 01:38:29.014792 1372607 certs.go:257] generating profile certs ...
	I1218 01:38:29.014905 1372607 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key
	I1218 01:38:29.014989 1372607 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.key.6e282b41
	I1218 01:38:29.015047 1372607 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.key
	I1218 01:38:29.015172 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem (1338 bytes)
	W1218 01:38:29.015222 1372607 certs.go:480] ignoring /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552_empty.pem, impossibly tiny 0 bytes
	I1218 01:38:29.015235 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca-key.pem (1679 bytes)
	I1218 01:38:29.015280 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/ca.pem (1078 bytes)
	I1218 01:38:29.015307 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/cert.pem (1123 bytes)
	I1218 01:38:29.015343 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/key.pem (1679 bytes)
	I1218 01:38:29.015392 1372607 certs.go:484] found cert: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem (1708 bytes)
	I1218 01:38:29.016071 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1218 01:38:29.034081 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1218 01:38:29.051630 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1218 01:38:29.068750 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1218 01:38:29.085542 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1218 01:38:29.102863 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1218 01:38:29.120574 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1218 01:38:29.138843 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1218 01:38:29.156856 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1218 01:38:29.175221 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/certs/1159552.pem --> /usr/share/ca-certificates/1159552.pem (1338 bytes)
	I1218 01:38:29.194997 1372607 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/ssl/certs/11595522.pem --> /usr/share/ca-certificates/11595522.pem (1708 bytes)
	I1218 01:38:29.216317 1372607 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1218 01:38:29.228656 1372607 ssh_runner.go:195] Run: openssl version
	I1218 01:38:29.234937 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.242063 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/1159552.pem /etc/ssl/certs/1159552.pem
	I1218 01:38:29.249045 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.252658 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 18 00:29 /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.252720 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1159552.pem
	I1218 01:38:29.294221 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1218 01:38:29.301641 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.309837 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/11595522.pem /etc/ssl/certs/11595522.pem
	I1218 01:38:29.317485 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.321249 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 18 00:29 /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.321311 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/11595522.pem
	I1218 01:38:29.362444 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1218 01:38:29.369840 1372607 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.377151 1372607 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1218 01:38:29.384343 1372607 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.399595 1372607 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 18 00:12 /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.399739 1372607 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1218 01:38:29.512585 1372607 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
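The `openssl x509 -hash` / `ln -fs` / `test -L` sequence above follows OpenSSL's trust-store convention: each CA certificate is linked into `/etc/ssl/certs` under its subject-name hash with a `.0` suffix (here `51391683.0`, `3ec20f2e.0`, and `b5213941.0`). A minimal sketch of the same convention, using throwaway paths rather than the cluster's real trust store:

```shell
# Create a throwaway self-signed CA cert (paths and CN are illustrative).
openssl req -x509 -newkey rsa:2048 -keyout /tmp/demo-ca.key -out /tmp/demo-ca.pem \
  -days 1 -nodes -subj "/CN=demoCA" 2>/dev/null

# `openssl x509 -hash` prints the subject-name hash OpenSSL uses for lookups.
hash=$(openssl x509 -hash -noout -in /tmp/demo-ca.pem)

# The trust store resolves CAs via symlinks named <hash>.<n>; .0 is the first.
mkdir -p /tmp/demo-certs
ln -fs /tmp/demo-ca.pem "/tmp/demo-certs/${hash}.0"
ls -l "/tmp/demo-certs/${hash}.0"
```

This is why the log pairs every `ln -fs` with a later `sudo test -L /etc/ssl/certs/<hash>.0`: the symlink's presence is what makes the CA discoverable to OpenSSL-based clients.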
	I1218 01:38:29.539512 1372607 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1218 01:38:29.549234 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1218 01:38:29.669085 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1218 01:38:29.784103 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1218 01:38:29.847217 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1218 01:38:29.900574 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1218 01:38:29.954328 1372607 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
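The run of `-checkend 86400` invocations above is how minikube verifies that each control-plane certificate will still be valid 24 hours (86400 seconds) from now: the command exits 0 if the cert outlives the window and 1 otherwise. A minimal sketch against a throwaway cert (all paths are illustrative):

```shell
# Generate a throwaway cert valid for 10 days.
openssl req -x509 -newkey rsa:2048 -keyout /tmp/demo.key -out /tmp/demo.crt \
  -days 10 -nodes -subj "/CN=demo" 2>/dev/null

# -checkend N succeeds (exit 0) only if the cert remains valid N seconds from now.
if openssl x509 -noout -in /tmp/demo.crt -checkend 86400; then
  echo "cert good for at least 24h"
fi
```

A 10-day cert passes the 24-hour check but would fail `-checkend 2592000` (30 days), which is the signal minikube uses to decide whether certificates need regeneration before restarting the cluster.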
	I1218 01:38:30.017688 1372607 kubeadm.go:401] StartCluster: {Name:pause-022448 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:pause-022448 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[
] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-
aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 01:38:30.017911 1372607 cri.go:54] listing CRI containers in root : {State:paused Name: Namespaces:[kube-system]}
	I1218 01:38:30.018018 1372607 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1218 01:38:30.079383 1372607 cri.go:89] found id: "8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73"
	I1218 01:38:30.079453 1372607 cri.go:89] found id: "350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570"
	I1218 01:38:30.079469 1372607 cri.go:89] found id: "a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747"
	I1218 01:38:30.079487 1372607 cri.go:89] found id: "e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3"
	I1218 01:38:30.079504 1372607 cri.go:89] found id: "0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68"
	I1218 01:38:30.079537 1372607 cri.go:89] found id: "c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02"
	I1218 01:38:30.079553 1372607 cri.go:89] found id: "9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60"
	I1218 01:38:30.079572 1372607 cri.go:89] found id: "1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b"
	I1218 01:38:30.079602 1372607 cri.go:89] found id: "ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8"
	I1218 01:38:30.079628 1372607 cri.go:89] found id: "0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a"
	I1218 01:38:30.079646 1372607 cri.go:89] found id: "c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e"
	I1218 01:38:30.079664 1372607 cri.go:89] found id: "a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a"
	I1218 01:38:30.079697 1372607 cri.go:89] found id: "167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	I1218 01:38:30.079714 1372607 cri.go:89] found id: "db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed"
	I1218 01:38:30.079731 1372607 cri.go:89] found id: ""
	I1218 01:38:30.079812 1372607 ssh_runner.go:195] Run: sudo runc list -f json
	W1218 01:38:30.099483 1372607 kubeadm.go:408] unpause failed: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T01:38:30Z" level=error msg="open /run/runc: no such file or directory"
	I1218 01:38:30.099643 1372607 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1218 01:38:30.112772 1372607 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1218 01:38:30.112834 1372607 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1218 01:38:30.112927 1372607 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1218 01:38:30.125157 1372607 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1218 01:38:30.125986 1372607 kubeconfig.go:125] found "pause-022448" server: "https://192.168.85.2:8443"
	I1218 01:38:30.126977 1372607 kapi.go:59] client config for pause-022448: &rest.Config{Host:"https://192.168.85.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:
[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 01:38:30.127855 1372607 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1218 01:38:30.127993 1372607 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1218 01:38:30.128023 1372607 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1218 01:38:30.128048 1372607 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1218 01:38:30.128081 1372607 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1218 01:38:30.128490 1372607 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1218 01:38:30.162975 1372607 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.85.2
	I1218 01:38:30.163056 1372607 kubeadm.go:602] duration metric: took 50.194643ms to restartPrimaryControlPlane
	I1218 01:38:30.163112 1372607 kubeadm.go:403] duration metric: took 145.405713ms to StartCluster
	I1218 01:38:30.163148 1372607 settings.go:142] acquiring lock: {Name:mkff738dcc016d79a7d7ac065fcd1bdaf0028027 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:30.163239 1372607 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 01:38:30.164256 1372607 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/kubeconfig: {Name:mkc9f9b47ec0c2f3aee28ec0d1c30d0b3b0d2ac4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 01:38:30.164556 1372607 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}
	I1218 01:38:30.165091 1372607 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1218 01:38:30.165390 1372607 config.go:182] Loaded profile config "pause-022448": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:38:30.169702 1372607 out.go:179] * Enabled addons: 
	I1218 01:38:30.169831 1372607 out.go:179] * Verifying Kubernetes components...
	I1218 01:38:30.172591 1372607 addons.go:530] duration metric: took 7.493704ms for enable addons: enabled=[]
	I1218 01:38:30.172769 1372607 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1218 01:38:30.450580 1372607 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1218 01:38:30.465846 1372607 node_ready.go:35] waiting up to 6m0s for node "pause-022448" to be "Ready" ...
	I1218 01:38:34.774950 1372607 node_ready.go:49] node "pause-022448" is "Ready"
	I1218 01:38:34.774976 1372607 node_ready.go:38] duration metric: took 4.309029376s for node "pause-022448" to be "Ready" ...
	I1218 01:38:34.774989 1372607 api_server.go:52] waiting for apiserver process to appear ...
	I1218 01:38:34.775044 1372607 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:38:34.792761 1372607 api_server.go:72] duration metric: took 4.628151076s to wait for apiserver process to appear ...
	I1218 01:38:34.792783 1372607 api_server.go:88] waiting for apiserver healthz status ...
	I1218 01:38:34.792802 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:34.823449 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[-]poststarthook/start-apiextensions-controllers failed: reason withheld
	[-]poststarthook/crd-informer-synced failed: reason withheld
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[-]poststarthook/start-service-ip-repair-controllers failed: reason withheld
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[-]poststarthook/priority-and-fairness-config-producer failed: reason withheld
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[-]poststarthook/start-kubernetes-service-cidr-controller failed: reason withheld
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[-]poststarthook/apiservice-registration-controller failed: reason withheld
	[-]poststarthook/apiservice-discovery-controller failed: reason withheld
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1218 01:38:34.823530 1372607 api_server.go:103] status: https://192.168.85.2:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[-]poststarthook/start-apiextensions-controllers failed: reason withheld
	[-]poststarthook/crd-informer-synced failed: reason withheld
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[-]poststarthook/start-service-ip-repair-controllers failed: reason withheld
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[-]poststarthook/priority-and-fairness-config-producer failed: reason withheld
	[-]poststarthook/bootstrap-controller failed: reason withheld
	[-]poststarthook/start-kubernetes-service-cidr-controller failed: reason withheld
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[-]poststarthook/apiservice-registration-controller failed: reason withheld
	[-]poststarthook/apiservice-discovery-controller failed: reason withheld
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1218 01:38:35.293183 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:35.301869 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1218 01:38:35.301898 1372607 api_server.go:103] status: https://192.168.85.2:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1218 01:38:35.793366 1372607 api_server.go:253] Checking apiserver healthz at https://192.168.85.2:8443/healthz ...
	I1218 01:38:35.801387 1372607 api_server.go:279] https://192.168.85.2:8443/healthz returned 200:
	ok
	I1218 01:38:35.803455 1372607 api_server.go:141] control plane version: v1.34.3
	I1218 01:38:35.803515 1372607 api_server.go:131] duration metric: took 1.010724282s to wait for apiserver health ...
	I1218 01:38:35.803539 1372607 system_pods.go:43] waiting for kube-system pods to appear ...
	I1218 01:38:35.809938 1372607 system_pods.go:59] 7 kube-system pods found
	I1218 01:38:35.810022 1372607 system_pods.go:61] "coredns-66bc5c9577-dlr4v" [c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 01:38:35.810046 1372607 system_pods.go:61] "etcd-pause-022448" [c4afcbf9-5fc8-4e08-a983-02f03fd32c82] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1218 01:38:35.810067 1372607 system_pods.go:61] "kindnet-dgxx6" [ba231964-7cab-44f1-9f75-137449bd092a] Running
	I1218 01:38:35.810091 1372607 system_pods.go:61] "kube-apiserver-pause-022448" [c7b45649-bbb0-44a5-9adc-6fc3f68c528c] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1218 01:38:35.810113 1372607 system_pods.go:61] "kube-controller-manager-pause-022448" [1006641e-db44-4894-a967-88876508da33] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1218 01:38:35.810133 1372607 system_pods.go:61] "kube-proxy-5cwsd" [3e34e4da-78fd-4bff-907a-b2ae772e0726] Running
	I1218 01:38:35.810154 1372607 system_pods.go:61] "kube-scheduler-pause-022448" [d5553265-5da5-49ec-92bb-a0f9fd9747b1] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1218 01:38:35.810175 1372607 system_pods.go:74] duration metric: took 6.617126ms to wait for pod list to return data ...
	I1218 01:38:35.810197 1372607 default_sa.go:34] waiting for default service account to be created ...
	I1218 01:38:35.811941 1372607 default_sa.go:45] found service account: "default"
	I1218 01:38:35.811995 1372607 default_sa.go:55] duration metric: took 1.779772ms for default service account to be created ...
	I1218 01:38:35.812019 1372607 system_pods.go:116] waiting for k8s-apps to be running ...
	I1218 01:38:35.815337 1372607 system_pods.go:86] 7 kube-system pods found
	I1218 01:38:35.815408 1372607 system_pods.go:89] "coredns-66bc5c9577-dlr4v" [c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1218 01:38:35.815442 1372607 system_pods.go:89] "etcd-pause-022448" [c4afcbf9-5fc8-4e08-a983-02f03fd32c82] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1218 01:38:35.815474 1372607 system_pods.go:89] "kindnet-dgxx6" [ba231964-7cab-44f1-9f75-137449bd092a] Running
	I1218 01:38:35.815502 1372607 system_pods.go:89] "kube-apiserver-pause-022448" [c7b45649-bbb0-44a5-9adc-6fc3f68c528c] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1218 01:38:35.815524 1372607 system_pods.go:89] "kube-controller-manager-pause-022448" [1006641e-db44-4894-a967-88876508da33] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1218 01:38:35.815555 1372607 system_pods.go:89] "kube-proxy-5cwsd" [3e34e4da-78fd-4bff-907a-b2ae772e0726] Running
	I1218 01:38:35.815582 1372607 system_pods.go:89] "kube-scheduler-pause-022448" [d5553265-5da5-49ec-92bb-a0f9fd9747b1] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1218 01:38:35.815604 1372607 system_pods.go:126] duration metric: took 3.565401ms to wait for k8s-apps to be running ...
	I1218 01:38:35.815636 1372607 system_svc.go:44] waiting for kubelet service to be running ....
	I1218 01:38:35.815724 1372607 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:38:35.838788 1372607 system_svc.go:56] duration metric: took 23.144353ms WaitForService to wait for kubelet
	I1218 01:38:35.838870 1372607 kubeadm.go:587] duration metric: took 5.674262921s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1218 01:38:35.838913 1372607 node_conditions.go:102] verifying NodePressure condition ...
	I1218 01:38:35.850393 1372607 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1218 01:38:35.850425 1372607 node_conditions.go:123] node cpu capacity is 2
	I1218 01:38:35.850437 1372607 node_conditions.go:105] duration metric: took 11.50504ms to run NodePressure ...
	I1218 01:38:35.850459 1372607 start.go:242] waiting for startup goroutines ...
	I1218 01:38:35.850467 1372607 start.go:247] waiting for cluster config update ...
	I1218 01:38:35.850475 1372607 start.go:256] writing updated cluster config ...
	I1218 01:38:35.850774 1372607 ssh_runner.go:195] Run: rm -f paused
	I1218 01:38:35.855768 1372607 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 01:38:35.856513 1372607 kapi.go:59] client config for pause-022448: &rest.Config{Host:"https://192.168.85.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.crt", KeyFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/pause-022448/client.key", CAFile:"/home/jenkins/minikube-integration/22186-1156339/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:
[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb51f0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1218 01:38:35.859695 1372607 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-dlr4v" in "kube-system" namespace to be "Ready" or be gone ...
	W1218 01:38:37.865094 1372607 pod_ready.go:104] pod "coredns-66bc5c9577-dlr4v" is not "Ready", error: <nil>
	W1218 01:38:39.865432 1372607 pod_ready.go:104] pod "coredns-66bc5c9577-dlr4v" is not "Ready", error: <nil>
	I1218 01:38:41.864793 1372607 pod_ready.go:94] pod "coredns-66bc5c9577-dlr4v" is "Ready"
	I1218 01:38:41.864822 1372607 pod_ready.go:86] duration metric: took 6.005102516s for pod "coredns-66bc5c9577-dlr4v" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:41.867162 1372607 pod_ready.go:83] waiting for pod "etcd-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:42.372657 1372607 pod_ready.go:94] pod "etcd-pause-022448" is "Ready"
	I1218 01:38:42.372685 1372607 pod_ready.go:86] duration metric: took 505.496847ms for pod "etcd-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:42.375317 1372607 pod_ready.go:83] waiting for pod "kube-apiserver-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	W1218 01:38:44.381276 1372607 pod_ready.go:104] pod "kube-apiserver-pause-022448" is not "Ready", error: <nil>
	W1218 01:38:46.880868 1372607 pod_ready.go:104] pod "kube-apiserver-pause-022448" is not "Ready", error: <nil>
	I1218 01:38:47.880871 1372607 pod_ready.go:94] pod "kube-apiserver-pause-022448" is "Ready"
	I1218 01:38:47.880896 1372607 pod_ready.go:86] duration metric: took 5.505553093s for pod "kube-apiserver-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.883236 1372607 pod_ready.go:83] waiting for pod "kube-controller-manager-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.888151 1372607 pod_ready.go:94] pod "kube-controller-manager-pause-022448" is "Ready"
	I1218 01:38:47.888179 1372607 pod_ready.go:86] duration metric: took 4.915285ms for pod "kube-controller-manager-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.890525 1372607 pod_ready.go:83] waiting for pod "kube-proxy-5cwsd" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.895132 1372607 pod_ready.go:94] pod "kube-proxy-5cwsd" is "Ready"
	I1218 01:38:47.895158 1372607 pod_ready.go:86] duration metric: took 4.608948ms for pod "kube-proxy-5cwsd" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:47.897552 1372607 pod_ready.go:83] waiting for pod "kube-scheduler-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:48.079772 1372607 pod_ready.go:94] pod "kube-scheduler-pause-022448" is "Ready"
	I1218 01:38:48.079806 1372607 pod_ready.go:86] duration metric: took 182.18767ms for pod "kube-scheduler-pause-022448" in "kube-system" namespace to be "Ready" or be gone ...
	I1218 01:38:48.079818 1372607 pod_ready.go:40] duration metric: took 12.224019823s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1218 01:38:48.134352 1372607 start.go:625] kubectl: 1.33.2, cluster: 1.34.3 (minor skew: 1)
	I1218 01:38:48.138956 1372607 out.go:179] * Done! kubectl is now configured to use "pause-022448" cluster and "default" namespace by default
	
	
	==> CRI-O <==
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.650285846Z" level=info msg="Started container" PID=2328 containerID=c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02 description=kube-system/kube-proxy-5cwsd/kube-proxy id=455d2865-bc3f-4508-8cab-dd78d768beb6 name=/runtime.v1.RuntimeService/StartContainer sandboxID=000b371f75dec4868286d3a04b8ec41966c40f32a8cd0123350ef1b3407753a1
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.685034241Z" level=info msg="Created container 350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570: kube-system/kube-scheduler-pause-022448/kube-scheduler" id=65361021-b6bd-4665-a7f9-7bf088ec46f9 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.686352454Z" level=info msg="Starting container: 350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570" id=9684a66e-03d9-411c-8466-26275b2b05e3 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.69242197Z" level=info msg="Created container e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3: kube-system/kindnet-dgxx6/kindnet-cni" id=92766cea-f357-4985-8162-73bfdf4aac35 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.696483783Z" level=info msg="Starting container: e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3" id=1cf4cb4d-ab3d-4df3-aacf-698fad58d397 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.704138541Z" level=info msg="Started container" PID=2374 containerID=350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570 description=kube-system/kube-scheduler-pause-022448/kube-scheduler id=9684a66e-03d9-411c-8466-26275b2b05e3 name=/runtime.v1.RuntimeService/StartContainer sandboxID=208cbf5a8b1beff70580cec37ac914fc2acca76de0ab701057fcec0b96433d14
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.714577545Z" level=info msg="Started container" PID=2357 containerID=e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3 description=kube-system/kindnet-dgxx6/kindnet-cni id=1cf4cb4d-ab3d-4df3-aacf-698fad58d397 name=/runtime.v1.RuntimeService/StartContainer sandboxID=7e3192171b757ca842418cf43e843866c352c7b7e9769061a932689239b82c3f
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.770785887Z" level=info msg="Created container 8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73: kube-system/etcd-pause-022448/etcd" id=b2cd9fc2-4dc9-44d7-9b95-da3cc2697336 name=/runtime.v1.RuntimeService/CreateContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.771462864Z" level=info msg="Starting container: 8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73" id=fc973e48-a765-4851-ba72-d8b15e5a3e27 name=/runtime.v1.RuntimeService/StartContainer
	Dec 18 01:38:29 pause-022448 crio[2101]: time="2025-12-18T01:38:29.773490095Z" level=info msg="Started container" PID=2388 containerID=8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73 description=kube-system/etcd-pause-022448/etcd id=fc973e48-a765-4851-ba72-d8b15e5a3e27 name=/runtime.v1.RuntimeService/StartContainer sandboxID=34e2e5f0c28e0469728c35fc292cb7ef48476fd2edb359e390c3102ab8f5fa4f
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.146035063Z" level=info msg="CNI monitoring event CREATE        \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15051887Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.150554988Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15057989Z" level=info msg="CNI monitoring event WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15393696Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.153975843Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.153999769Z" level=info msg="CNI monitoring event WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.157511485Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.15754702Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.157569575Z" level=info msg="CNI monitoring event RENAME        \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161719279Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161759836Z" level=info msg="Updated default CNI network name to kindnet"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.161786297Z" level=info msg="CNI monitoring event CREATE        \"/etc/cni/net.d/10-kindnet.conflist\" ← \"/etc/cni/net.d/10-kindnet.conflist.temp\""
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.16517832Z" level=info msg="Found CNI network kindnet (type=ptp) at /etc/cni/net.d/10-kindnet.conflist"
	Dec 18 01:38:40 pause-022448 crio[2101]: time="2025-12-18T01:38:40.165216448Z" level=info msg="Updated default CNI network name to kindnet"
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                CREATED              STATE               NAME                      ATTEMPT             POD ID              POD                                    NAMESPACE
	8a80f1b4ef9e2       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                     24 seconds ago       Running             etcd                      1                   34e2e5f0c28e0       etcd-pause-022448                      kube-system
	350ef333bdeea       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                     24 seconds ago       Running             kube-scheduler            1                   208cbf5a8b1be       kube-scheduler-pause-022448            kube-system
	a94767c237847       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                     24 seconds ago       Running             kube-apiserver            1                   91ec02eee6371       kube-apiserver-pause-022448            kube-system
	e0a20554f5b0e       c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13                                     24 seconds ago       Running             kindnet-cni               1                   7e3192171b757       kindnet-dgxx6                          kube-system
	0162b54e23cb5       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                     24 seconds ago       Running             coredns                   1                   e8347e2183b9b       coredns-66bc5c9577-dlr4v               kube-system
	c0914c54d5ff2       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                     24 seconds ago       Running             kube-proxy                1                   000b371f75dec       kube-proxy-5cwsd                       kube-system
	9b6ce068b25af       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                     24 seconds ago       Running             kube-controller-manager   1                   1ab511191087d       kube-controller-manager-pause-022448   kube-system
	1ac523005a68f       138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc                                     36 seconds ago       Exited              coredns                   0                   e8347e2183b9b       coredns-66bc5c9577-dlr4v               kube-system
	ad0758ba82bfa       docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3   47 seconds ago       Exited              kindnet-cni               0                   7e3192171b757       kindnet-dgxx6                          kube-system
	0cefb58714e9a       4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162                                     49 seconds ago       Exited              kube-proxy                0                   000b371f75dec       kube-proxy-5cwsd                       kube-system
	c9db25fe09d87       cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896                                     About a minute ago   Exited              kube-apiserver            0                   91ec02eee6371       kube-apiserver-pause-022448            kube-system
	a1287f0a1dfc7       7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22                                     About a minute ago   Exited              kube-controller-manager   0                   1ab511191087d       kube-controller-manager-pause-022448   kube-system
	167bdfb1e99e7       2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42                                     About a minute ago   Exited              etcd                      0                   34e2e5f0c28e0       etcd-pause-022448                      kube-system
	db6702726086b       2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6                                     About a minute ago   Exited              kube-scheduler            0                   208cbf5a8b1be       kube-scheduler-pause-022448            kube-system
	
	
	==> coredns [0162b54e23cb54047bc95383db70202f778a1c5d4f4fcfd46009857336b61a68] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = fa9a0cdcdddcb4be74a0eaf7cfcb211c40e29ddf5507e03bbfc0065bade31f0f2641a2513136e246f32328dd126fc93236fb5c595246f0763926a524386705e8
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:55089 - 22251 "HINFO IN 3841722693218782860.4026371041655979999. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.026485187s
	
	
	==> coredns [1ac523005a68ff5c94d7036fabf8355333cd3f9b3e4d677fcdffcc226e43044b] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = fa9a0cdcdddcb4be74a0eaf7cfcb211c40e29ddf5507e03bbfc0065bade31f0f2641a2513136e246f32328dd126fc93236fb5c595246f0763926a524386705e8
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:45272 - 33276 "HINFO IN 3661192616598581439.7880941379114653225. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.011616725s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               pause-022448
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=pause-022448
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=2e96f676eb7e96389e85fe0658a4ede4c4ba6924
	                    minikube.k8s.io/name=pause-022448
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_12_18T01_38_00_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 18 Dec 2025 01:37:56 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-022448
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 18 Dec 2025 01:38:45 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:37:52 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 18 Dec 2025 01:38:45 +0000   Thu, 18 Dec 2025 01:38:16 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.85.2
	  Hostname:    pause-022448
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	System Info:
	  Machine ID:                 02ff784b806e34735a6e229a69428228
	  System UUID:                a02f202d-16ca-4496-9551-42799335e19c
	  Boot ID:                    57207cc2-434a-4297-a7b8-47b6fa2e7487
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  cri-o://1.34.3
	  Kubelet Version:            v1.34.3
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                    CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                    ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-66bc5c9577-dlr4v                100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     50s
	  kube-system                 etcd-pause-022448                       100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         55s
	  kube-system                 kindnet-dgxx6                           100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      50s
	  kube-system                 kube-apiserver-pause-022448             250m (12%)    0 (0%)      0 (0%)           0 (0%)         55s
	  kube-system                 kube-controller-manager-pause-022448    200m (10%)    0 (0%)      0 (0%)           0 (0%)         55s
	  kube-system                 kube-proxy-5cwsd                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         50s
	  kube-system                 kube-scheduler-pause-022448             100m (5%)     0 (0%)      0 (0%)           0 (0%)         55s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age   From             Message
	  ----     ------                   ----  ----             -------
	  Normal   Starting                 48s   kube-proxy       
	  Normal   Starting                 19s   kube-proxy       
	  Normal   Starting                 55s   kubelet          Starting kubelet.
	  Warning  CgroupV1                 55s   kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  55s   kubelet          Node pause-022448 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    55s   kubelet          Node pause-022448 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     55s   kubelet          Node pause-022448 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           51s   node-controller  Node pause-022448 event: Registered Node pause-022448 in Controller
	  Normal   NodeReady                38s   kubelet          Node pause-022448 status is now: NodeReady
	  Normal   RegisteredNode           17s   node-controller  Node pause-022448 event: Registered Node pause-022448 in Controller
	
	
	==> dmesg <==
	[  +3.441233] overlayfs: idmapped layers are currently not supported
	[ +34.067705] overlayfs: idmapped layers are currently not supported
	[Dec18 01:06] overlayfs: idmapped layers are currently not supported
	[Dec18 01:07] overlayfs: idmapped layers are currently not supported
	[  +2.867801] overlayfs: idmapped layers are currently not supported
	[Dec18 01:08] overlayfs: idmapped layers are currently not supported
	[Dec18 01:09] overlayfs: idmapped layers are currently not supported
	[Dec18 01:10] overlayfs: idmapped layers are currently not supported
	[Dec18 01:14] overlayfs: idmapped layers are currently not supported
	[Dec18 01:15] overlayfs: idmapped layers are currently not supported
	[Dec18 01:16] overlayfs: idmapped layers are currently not supported
	[ +41.843420] overlayfs: idmapped layers are currently not supported
	[Dec18 01:17] overlayfs: idmapped layers are currently not supported
	[Dec18 01:18] overlayfs: idmapped layers are currently not supported
	[Dec18 01:19] overlayfs: idmapped layers are currently not supported
	[  +7.804932] overlayfs: idmapped layers are currently not supported
	[Dec18 01:20] overlayfs: idmapped layers are currently not supported
	[ +26.176950] overlayfs: idmapped layers are currently not supported
	[Dec18 01:21] overlayfs: idmapped layers are currently not supported
	[ +26.122242] overlayfs: idmapped layers are currently not supported
	[Dec18 01:22] overlayfs: idmapped layers are currently not supported
	[Dec18 01:23] overlayfs: idmapped layers are currently not supported
	[Dec18 01:25] overlayfs: idmapped layers are currently not supported
	[Dec18 01:27] overlayfs: idmapped layers are currently not supported
	[Dec18 01:37] overlayfs: idmapped layers are currently not supported
	
	
	==> etcd [167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99] <==
	{"level":"warn","ts":"2025-12-18T01:37:55.449346Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49916","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.464947Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49924","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.483624Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49944","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.512841Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49954","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.532836Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49972","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.548521Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:49976","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:37:55.633154Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:50004","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-12-18T01:38:21.598370Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-12-18T01:38:21.598432Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"pause-022448","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.85.2:2380"],"advertise-client-urls":["https://192.168.85.2:2379"]}
	{"level":"error","ts":"2025-12-18T01:38:21.598534Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T01:38:21.875763Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-12-18T01:38:21.875866Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.875888Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"9f0758e1c58a86ed","current-leader-member-id":"9f0758e1c58a86ed"}
	{"level":"info","ts":"2025-12-18T01:38:21.875993Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"info","ts":"2025-12-18T01:38:21.876010Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876052Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876123Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T01:38:21.876159Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876258Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.85.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-12-18T01:38:21.876277Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.85.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-12-18T01:38:21.876284Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.85.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.879047Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.85.2:2380"}
	{"level":"error","ts":"2025-12-18T01:38:21.879121Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.85.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-12-18T01:38:21.879158Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.85.2:2380"}
	{"level":"info","ts":"2025-12-18T01:38:21.879165Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"pause-022448","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.85.2:2380"],"advertise-client-urls":["https://192.168.85.2:2379"]}
	
	
	==> etcd [8a80f1b4ef9e2a4991c8fc6f01a75b52e03d19f20ab9e50254a8f3c39e731b73] <==
	{"level":"warn","ts":"2025-12-18T01:38:33.434992Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60562","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.453542Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60574","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.477288Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60592","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.489475Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60614","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.532162Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60632","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.533303Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60646","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.549757Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60668","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.572818Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60676","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.590340Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60692","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.613977Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60710","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.631338Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60728","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.643929Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60748","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.674419Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60760","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.683109Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60766","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.698828Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60784","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.714424Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60810","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.729104Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60826","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.745525Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60858","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.763810Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60878","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.777515Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60894","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.793995Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60912","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.833347Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60924","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.860299Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60956","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.874000Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:60980","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-18T01:38:33.929208Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:32780","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 01:38:54 up  8:21,  0 user,  load average: 2.60, 1.74, 1.86
	Linux pause-022448 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [ad0758ba82bfa40939449535a15f147c444da28b35a7a89947cd67d18b422eb8] <==
	I1218 01:38:06.628124       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 01:38:06.628550       1 main.go:139] hostIP = 192.168.85.2
	podIP = 192.168.85.2
	I1218 01:38:06.628684       1 main.go:148] setting mtu 1500 for CNI 
	I1218 01:38:06.628696       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 01:38:06.628708       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T01:38:06Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1218 01:38:06.828981       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 01:38:06.829167       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 01:38:06.829206       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 01:38:06.832035       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1218 01:38:07.031989       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1218 01:38:07.032076       1 metrics.go:72] Registering metrics
	I1218 01:38:07.032155       1 controller.go:711] "Syncing nftables rules"
	I1218 01:38:16.832903       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:16.832958       1 main.go:301] handling current node
	
	
	==> kindnet [e0a20554f5b0e0efbbe67e8f2eab3a25a33b19c7fa8a9d0fad7b3d799be103e3] <==
	I1218 01:38:29.877429       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1218 01:38:29.877780       1 main.go:139] hostIP = 192.168.85.2
	podIP = 192.168.85.2
	I1218 01:38:29.892469       1 main.go:148] setting mtu 1500 for CNI 
	I1218 01:38:29.896246       1 main.go:178] kindnetd IP family: "ipv4"
	I1218 01:38:29.896297       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-18T01:38:30Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1218 01:38:30.145929       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1218 01:38:30.146016       1 controller.go:381] "Waiting for informer caches to sync"
	I1218 01:38:30.146051       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1218 01:38:30.150995       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1218 01:38:34.858241       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1218 01:38:34.858278       1 metrics.go:72] Registering metrics
	I1218 01:38:34.858349       1 controller.go:711] "Syncing nftables rules"
	I1218 01:38:40.145640       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:40.145705       1 main.go:301] handling current node
	I1218 01:38:50.145780       1 main.go:297] Handling node with IPs: map[192.168.85.2:{}]
	I1218 01:38:50.145816       1 main.go:301] handling current node
	
	
	==> kube-apiserver [a94767c2378471cdb6680f59e5e494dcdf0ebd548fac3cf795b5e3cf8ba6a747] <==
	I1218 01:38:34.853946       1 cache.go:39] Caches are synced for LocalAvailability controller
	I1218 01:38:34.854178       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I1218 01:38:34.854196       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I1218 01:38:34.854330       1 shared_informer.go:356] "Caches are synced" controller="configmaps"
	I1218 01:38:34.854551       1 handler_discovery.go:451] Starting ResourceDiscoveryManager
	I1218 01:38:34.854647       1 shared_informer.go:356] "Caches are synced" controller="node_authorizer"
	I1218 01:38:34.866953       1 shared_informer.go:356] "Caches are synced" controller="kubernetes-service-cidr-controller"
	I1218 01:38:34.867070       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1218 01:38:34.872554       1 shared_informer.go:356] "Caches are synced" controller="*generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]"
	I1218 01:38:34.872598       1 policy_source.go:240] refreshing policies
	I1218 01:38:34.872642       1 shared_informer.go:356] "Caches are synced" controller="crd-autoregister"
	I1218 01:38:34.872683       1 shared_informer.go:356] "Caches are synced" controller="ipallocator-repair-controller"
	I1218 01:38:34.872802       1 aggregator.go:171] initial CRD sync complete...
	I1218 01:38:34.872817       1 autoregister_controller.go:144] Starting autoregister controller
	I1218 01:38:34.872823       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I1218 01:38:34.872829       1 cache.go:39] Caches are synced for autoregister controller
	E1218 01:38:34.876537       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I1218 01:38:34.882066       1 cidrallocator.go:301] created ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1218 01:38:34.894320       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1218 01:38:35.451790       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1218 01:38:35.786384       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	I1218 01:38:37.230921       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1218 01:38:37.278976       1 controller.go:667] quota admission added evaluator for: endpoints
	I1218 01:38:37.529216       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1218 01:38:37.581168       1 controller.go:667] quota admission added evaluator for: deployments.apps
	
	
	==> kube-apiserver [c9db25fe09d87c9c002d2e768f43aec9193e4ba2fc0d81a37e70c59df5e7ae4e] <==
	W1218 01:38:21.618173       1 logging.go:55] [core] [Channel #243 SubChannel #245]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618253       1 logging.go:55] [core] [Channel #55 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617380       1 logging.go:55] [core] [Channel #27 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618369       1 logging.go:55] [core] [Channel #103 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617507       1 logging.go:55] [core] [Channel #247 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618474       1 logging.go:55] [core] [Channel #143 SubChannel #145]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617637       1 logging.go:55] [core] [Channel #75 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618604       1 logging.go:55] [core] [Channel #147 SubChannel #149]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.617958       1 logging.go:55] [core] [Channel #163 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618689       1 logging.go:55] [core] [Channel #215 SubChannel #217]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618141       1 logging.go:55] [core] [Channel #195 SubChannel #197]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618266       1 logging.go:55] [core] [Channel #67 SubChannel #69]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.616649       1 logging.go:55] [core] [Channel #139 SubChannel #141]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618337       1 logging.go:55] [core] [Channel #135 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618433       1 logging.go:55] [core] [Channel #119 SubChannel #121]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.618532       1 logging.go:55] [core] [Channel #175 SubChannel #177]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619078       1 logging.go:55] [core] [Channel #251 SubChannel #253]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619217       1 logging.go:55] [core] [Channel #59 SubChannel #61]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619296       1 logging.go:55] [core] [Channel #115 SubChannel #117]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619368       1 logging.go:55] [core] [Channel #123 SubChannel #125]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619624       1 logging.go:55] [core] [Channel #35 SubChannel #37]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619724       1 logging.go:55] [core] [Channel #87 SubChannel #89]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619780       1 logging.go:55] [core] [Channel #131 SubChannel #133]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.619860       1 logging.go:55] [core] [Channel #11 SubChannel #14]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W1218 01:38:21.620165       1 logging.go:55] [core] [Channel #231 SubChannel #233]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [9b6ce068b25afaed97ed3ee737b3ea40a3f51b2655b409c819cc4df7134e7c60] <==
	I1218 01:38:37.173522       1 shared_informer.go:356] "Caches are synced" controller="ReplicationController"
	I1218 01:38:37.175993       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1218 01:38:37.177351       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1218 01:38:37.179025       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 01:38:37.181244       1 shared_informer.go:356] "Caches are synced" controller="crt configmap"
	I1218 01:38:37.182438       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 01:38:37.185584       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1218 01:38:37.189775       1 shared_informer.go:356] "Caches are synced" controller="node"
	I1218 01:38:37.189826       1 range_allocator.go:177] "Sending events to api server" logger="node-ipam-controller"
	I1218 01:38:37.189849       1 range_allocator.go:183] "Starting range CIDR allocator" logger="node-ipam-controller"
	I1218 01:38:37.189854       1 shared_informer.go:349] "Waiting for caches to sync" controller="cidrallocator"
	I1218 01:38:37.189859       1 shared_informer.go:356] "Caches are synced" controller="cidrallocator"
	I1218 01:38:37.193332       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1218 01:38:37.195677       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1218 01:38:37.198909       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 01:38:37.201152       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1218 01:38:37.203389       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:37.207522       1 shared_informer.go:356] "Caches are synced" controller="namespace"
	I1218 01:38:37.216881       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:37.216912       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1218 01:38:37.216922       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1218 01:38:37.220292       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1218 01:38:37.222604       1 shared_informer.go:356] "Caches are synced" controller="stateful set"
	I1218 01:38:37.222661       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1218 01:38:37.223074       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	
	
	==> kube-controller-manager [a1287f0a1dfc7f7538b9abd12c4221e7ed37733963fea216ccfa3f7ccd5aa35a] <==
	I1218 01:38:03.441618       1 shared_informer.go:349] "Waiting for caches to sync" controller="cidrallocator"
	I1218 01:38:03.441662       1 shared_informer.go:356] "Caches are synced" controller="cidrallocator"
	I1218 01:38:03.442084       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1218 01:38:03.451774       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1218 01:38:03.456331       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1218 01:38:03.457418       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1218 01:38:03.457950       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="pause-022448" podCIDRs=["10.244.0.0/24"]
	I1218 01:38:03.464302       1 shared_informer.go:356] "Caches are synced" controller="service-cidr-controller"
	I1218 01:38:03.470810       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1218 01:38:03.478863       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1218 01:38:03.478947       1 shared_informer.go:356] "Caches are synced" controller="stateful set"
	I1218 01:38:03.479003       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1218 01:38:03.479063       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1218 01:38:03.479084       1 shared_informer.go:356] "Caches are synced" controller="daemon sets"
	I1218 01:38:03.479117       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1218 01:38:03.479144       1 shared_informer.go:356] "Caches are synced" controller="deployment"
	I1218 01:38:03.479159       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1218 01:38:03.481493       1 shared_informer.go:356] "Caches are synced" controller="expand"
	I1218 01:38:03.481735       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1218 01:38:03.481745       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1218 01:38:03.481755       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1218 01:38:03.481764       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1218 01:38:03.488044       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1218 01:38:03.499113       1 shared_informer.go:356] "Caches are synced" controller="cronjob"
	I1218 01:38:18.431653       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-proxy [0cefb58714e9a197bcf9195f46f58dbdc1581736c8bc69dfb1779638872d0e6a] <==
	I1218 01:38:04.854025       1 server_linux.go:53] "Using iptables proxy"
	I1218 01:38:04.937684       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 01:38:05.037853       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 01:38:05.037960       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.85.2"]
	E1218 01:38:05.038167       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 01:38:05.059416       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 01:38:05.059536       1 server_linux.go:132] "Using iptables Proxier"
	I1218 01:38:05.063681       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 01:38:05.063981       1 server.go:527] "Version info" version="v1.34.3"
	I1218 01:38:05.064004       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:05.065523       1 config.go:200] "Starting service config controller"
	I1218 01:38:05.065547       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 01:38:05.065568       1 config.go:106] "Starting endpoint slice config controller"
	I1218 01:38:05.065572       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 01:38:05.065596       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 01:38:05.065691       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 01:38:05.066305       1 config.go:309] "Starting node config controller"
	I1218 01:38:05.066323       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 01:38:05.066329       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 01:38:05.166335       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 01:38:05.166401       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 01:38:05.166545       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-proxy [c0914c54d5ff2e1aa458c17d0b27f459b5a5ab6703169b3e164e618ae38b5c02] <==
	I1218 01:38:33.315668       1 server_linux.go:53] "Using iptables proxy"
	I1218 01:38:33.794499       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1218 01:38:34.895701       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1218 01:38:34.895733       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.85.2"]
	E1218 01:38:34.895832       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1218 01:38:34.934437       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1218 01:38:34.935131       1 server_linux.go:132] "Using iptables Proxier"
	I1218 01:38:34.948973       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1218 01:38:34.949303       1 server.go:527] "Version info" version="v1.34.3"
	I1218 01:38:34.949482       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:34.951700       1 config.go:200] "Starting service config controller"
	I1218 01:38:34.951764       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1218 01:38:34.951807       1 config.go:106] "Starting endpoint slice config controller"
	I1218 01:38:34.951835       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1218 01:38:34.951871       1 config.go:403] "Starting serviceCIDR config controller"
	I1218 01:38:34.951896       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1218 01:38:34.952678       1 config.go:309] "Starting node config controller"
	I1218 01:38:34.952749       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1218 01:38:34.952780       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1218 01:38:35.052744       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1218 01:38:35.053526       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1218 01:38:35.053552       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [350ef333bdeea99a71d93406f0ff639946c6aff5d3f12f6c242d013d61967570] <==
	I1218 01:38:32.710955       1 serving.go:386] Generated self-signed cert in-memory
	W1218 01:38:34.770003       1 requestheader_controller.go:204] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W1218 01:38:34.770119       1 authentication.go:397] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W1218 01:38:34.770156       1 authentication.go:398] Continuing without authentication configuration. This may treat all requests as anonymous.
	W1218 01:38:34.770198       1 authentication.go:399] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I1218 01:38:34.838850       1 server.go:175] "Starting Kubernetes Scheduler" version="v1.34.3"
	I1218 01:38:34.838957       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1218 01:38:34.841033       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:34.841118       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:34.841576       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	I1218 01:38:34.841680       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I1218 01:38:34.942329       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kube-scheduler [db6702726086bd3d00640de7001082b0972c84d9e4ecba425ff114e9b97f80ed] <==
	E1218 01:37:56.545224       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1218 01:37:56.545269       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1218 01:37:56.545308       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1218 01:37:56.545351       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1218 01:37:56.545413       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1218 01:37:56.545460       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1218 01:37:56.545504       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 01:37:56.545546       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1218 01:37:56.545636       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1218 01:37:56.545687       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 01:37:56.545722       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1218 01:37:56.544962       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1218 01:37:57.367785       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1218 01:37:57.378572       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1218 01:37:57.504599       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1218 01:37:57.525795       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1218 01:37:57.539666       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1218 01:37:57.905852       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1218 01:37:59.794816       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:21.594489       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1218 01:38:21.594512       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1218 01:38:21.594532       1 tlsconfig.go:258] "Shutting down DynamicServingCertificateController"
	I1218 01:38:21.594560       1 configmap_cafile_content.go:226] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1218 01:38:21.594713       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1218 01:38:21.594733       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497421    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497646    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497874    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-dlr4v\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.497952    1316 controller.go:195] "Failed to update lease" err="Put \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: I1218 01:38:29.497974    1316 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.498190    1316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused" interval="200ms"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: I1218 01:38:29.509269    1316 scope.go:117] "RemoveContainer" containerID="167bdfb1e99e759086742f5f5b9177f2825cae292574000fb98d46ba3883be99"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.509770    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-proxy-5cwsd\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="3e34e4da-78fd-4bff-907a-b2ae772e0726" pod="kube-system/kube-proxy-5cwsd"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.509929    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/coredns-66bc5c9577-dlr4v\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510075    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="ea3387825836e7fd100b14a1e723ead4" pod="kube-system/kube-controller-manager-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510218    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="9399ade1e13df4d0bf1cb0e0daf33ecc" pod="kube-system/kube-scheduler-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510360    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/etcd-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="4bd8d366a1589788e68113f96a17c367" pod="kube-system/etcd-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510527    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-pause-022448\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="3bee34e012a9123c51f981d1d256b412" pod="kube-system/kube-apiserver-pause-022448"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.510682    1316 status_manager.go:1018] "Failed to get status for pod" err="Get \"https://192.168.85.2:8443/api/v1/namespaces/kube-system/pods/kindnet-dgxx6\": dial tcp 192.168.85.2:8443: connect: connection refused" podUID="ba231964-7cab-44f1-9f75-137449bd092a" pod="kube-system/kindnet-dgxx6"
	Dec 18 01:38:29 pause-022448 kubelet[1316]: E1218 01:38:29.698894    1316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://192.168.85.2:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/pause-022448?timeout=10s\": dial tcp 192.168.85.2:8443: connect: connection refused" interval="400ms"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.693944    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"kube-proxy-5cwsd\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="3e34e4da-78fd-4bff-907a-b2ae772e0726" pod="kube-system/kube-proxy-5cwsd"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694628    1316 reflector.go:205] "Failed to watch" err="configmaps \"coredns\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"coredns\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694683    1316 reflector.go:205] "Failed to watch" err="configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.694708    1316 reflector.go:205] "Failed to watch" err="configmaps \"kube-proxy\" is forbidden: User \"system:node:pause-022448\" cannot watch resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"kube-proxy\"" type="*v1.ConfigMap"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.765576    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"coredns-66bc5c9577-dlr4v\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="c64ad0c8-e035-4cf4-b4ae-14f8f6ad0505" pod="kube-system/coredns-66bc5c9577-dlr4v"
	Dec 18 01:38:34 pause-022448 kubelet[1316]: E1218 01:38:34.793956    1316 status_manager.go:1018] "Failed to get status for pod" err="pods \"kube-controller-manager-pause-022448\" is forbidden: User \"system:node:pause-022448\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'pause-022448' and this object" podUID="ea3387825836e7fd100b14a1e723ead4" pod="kube-system/kube-controller-manager-pause-022448"
	Dec 18 01:38:39 pause-022448 kubelet[1316]: W1218 01:38:39.369468    1316 conversion.go:112] Could not get instant cpu stats: cumulative stats decrease
	Dec 18 01:38:48 pause-022448 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
	Dec 18 01:38:48 pause-022448 systemd[1]: kubelet.service: Deactivated successfully.
	Dec 18 01:38:48 pause-022448 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p pause-022448 -n pause-022448
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p pause-022448 -n pause-022448: exit status 2 (381.87368ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:270: (dbg) Run:  kubectl --context pause-022448 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:294: <<< TestPause/serial/Pause FAILED: end of post-mortem logs <<<
helpers_test.go:295: ---------------------/post-mortem---------------------------------
--- FAIL: TestPause/serial/Pause (6.93s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (7200.079s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:285: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:338: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
panic: test timed out after 2h0m0s
	running tests:
		TestNetworkPlugins (32m26s)
		TestNetworkPlugins/group/flannel (45s)
		TestNetworkPlugins/group/flannel/Start (45s)
		TestStartStop (34m33s)
		TestStartStop/group/no-preload (25m46s)
		TestStartStop/group/no-preload/serial (25m46s)
		TestStartStop/group/no-preload/serial/AddonExistsAfterStop (32s)

goroutine 6101 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38

goroutine 1 [chan receive, 28 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x40004e8540, 0x40006b5bb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x400000e0d8, {0x534c680, 0x2c, 0x2c}, {0x40006b5d08?, 0x125774?, 0x53750c0?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x4000688f00)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x4000688f00)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 3952 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3951
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4303 [sync.Cond.Wait, 6 minutes]:
sync.runtime_notifyListWait(0x40019ec6d0, 0x1)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019ec6c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400047af00)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40016c2620?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x4001457ea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x4000474f38, {0x369e4a0, 0x400151c2d0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x4001457fa8?, {0x369e4a0?, 0x400151c2d0?}, 0x10?, 0x4001461200?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4004ea9a30, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4300
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5204 [chan receive, 6 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001a55440, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5202
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1592 [chan receive, 75 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004ffe660, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1590
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1578 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x400143df40, 0x40013b7f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x30?, 0x400143df40, 0x400143df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x400143dfa8?, 0x400224b2c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40004a8480?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1592
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4618 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40006b91d0, 0xf)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40006b91c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001625b60)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40017ba690?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x4001459ea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x400132cf38, {0x369e4a0, 0x4001836a20}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x4001459fa8?, {0x369e4a0?, 0x4001836a20?}, 0xb0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40025be440, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4616
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5188 [sync.Cond.Wait, 6 minutes]:
sync.runtime_notifyListWait(0x4001a30d50, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001a30d40)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001a55440)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001657e30?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x4001443ea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x40013b9f38, {0x369e4a0, 0x4001a88e40}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369e4a0?, 0x4001a88e40?}, 0x20?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001a92610, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5204
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3950 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40000dc910, 0x16)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40000dc900)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019b7200)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4004eee5b0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x4001459ef8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x400132df38, {0x369e4a0, 0x40019ef050}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x369e4a0?, 0x40019ef050?}, 0xe0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019e0c10, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3936
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3936 [chan receive, 28 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019b7200, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3931
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 160 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400047b020, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 152
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1965 [chan send, 73 minutes]:
os/exec.(*Cmd).watchCtx(0x40002a4600, 0x40016c24d0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1964
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4882 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x40014ffdc0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4881
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5997 [select]:
os/exec.(*Cmd).watchCtx(0x4001a7c000, 0x40018481c0)
	/usr/local/go/src/os/exec/exec.go:789 +0x70
created by os/exec.(*Cmd).Start in goroutine 5994
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 159 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 152
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 177 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 176
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 175 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40006b9390, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40006b9380)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400047b020)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4004eeeb60?, 0x24759c0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0xffff52cad450?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x4000473f38, {0x369e4a0, 0x4004f24810}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400009f7a8?, {0x369e4a0?, 0x4004f24810?}, 0x50?, 0x40004bc4c8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400130fa10, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 160
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 6022 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e6528, 0x4001a3f720}, {0x36d45e0, 0x40017775a0}, 0x1, 0x0, 0x40014d5b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e6598?, 0x40002fe690?}, 0x3b9aca00, 0x40014d5d28?, 0x1, 0x40014d5b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e6598, 0x40002fe690}, 0x40013d6540, {0x40006d01e0, 0x11}, {0x2994202, 0x14}, {0x29ac171, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:380 +0x22c
k8s.io/minikube/test/integration.validateAddonAfterStop({0x36e6598, 0x40002fe690}, 0x40013d6540, {0x40006d01e0, 0x11}, {0x297870e?, 0x1a901f7200161e84?}, {0x694362cf?, 0x400131af58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:285 +0xd4
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x40013d6540?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x40013d6540, 0x40004e4100)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4127
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3580 [chan receive]:
testing.(*T).Run(0x4000bfe1c0, {0x296d724?, 0x368ad80?}, 0x400069a480)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4000bfe1c0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:111 +0x4f4
testing.tRunner(0x4000bfe1c0, 0x40002b7200)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3544
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 176 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x40000a0f40, 0x400132af88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0xb0?, 0x40000a0f40, 0x40000a0f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001330000?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 160
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3544 [chan receive, 8 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x40014ff500, 0x40012faa50)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3214
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5190 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5189
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3578 [chan receive, 33 minutes]:
testing.(*testState).waitParallel(0x400071caf0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001b4dc00)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001b4dc00)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001b4dc00)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001b4dc00, 0x40002b6000)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3544
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4304 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x400009f740, 0x400009f788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x99?, 0x400009f740, 0x400009f788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x400009f750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x4000224080?, 0x400184c1c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4300
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5994 [syscall]:
syscall.Syscall6(0x5f, 0x3, 0x10, 0x4000108c38, 0x4, 0x400171e090, 0x0)
	/usr/local/go/src/syscall/syscall_linux.go:96 +0x2c
internal/syscall/unix.Waitid(0x4000108d98?, 0x1929a0?, 0xfffffca2519f?, 0x0?, 0x400183a000?)
	/usr/local/go/src/internal/syscall/unix/waitid_linux.go:18 +0x44
os.(*Process).pidfdWait.func1(...)
	/usr/local/go/src/os/pidfd_linux.go:109
os.ignoringEINTR(...)
	/usr/local/go/src/os/file_posix.go:256
os.(*Process).pidfdWait(0x40019ec080)
	/usr/local/go/src/os/pidfd_linux.go:108 +0x144
os.(*Process).wait(0x4000108d68?)
	/usr/local/go/src/os/exec_unix.go:25 +0x24
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:340
os/exec.(*Cmd).Wait(0x4001a7c000)
	/usr/local/go/src/os/exec/exec.go:922 +0x38
os/exec.(*Cmd).Run(0x4001a7c000)
	/usr/local/go/src/os/exec/exec.go:626 +0x38
k8s.io/minikube/test/integration.Run(0x4004f3b340, 0x4001a7c000)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:104 +0x154
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1.1(0x4004f3b340)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:112 +0x44
testing.tRunner(0x4004f3b340, 0x400069a480)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3580
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 2074 [chan send, 73 minutes]:
os/exec.(*Cmd).watchCtx(0x4001330180, 0x40016c2380)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1498
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4299 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x400184c1c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4298
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3951 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x4001385f40, 0x40013b6f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x88?, 0x4001385f40, 0x4001385f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x13?, 0x1?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001460900?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3936
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1577 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x40019eced0, 0x23)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019ecec0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004ffe660)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400017bb90?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x40013b3f38, {0x369e4a0, 0x4001cb9a10}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x4001cb9a10?}, 0xb0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019e1450, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1592
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3214 [chan receive, 33 minutes]:
testing.(*T).Run(0x4001b4c000, {0x296d71f?, 0x1a67d39a7bd0?}, 0x40012faa50)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins(0x4001b4c000)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:52 +0xe4
testing.tRunner(0x4001b4c000, 0x339bac8)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5189 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x400138a740, 0x400138a788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0xb8?, 0x400138a740, 0x400138a788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x400138a750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x4000224080?, 0x40004e88c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5204
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3516 [chan receive, 25 minutes]:
testing.(*T).Run(0x4001637a40, {0x296eb91?, 0x0?}, 0x40018be080)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x4001637a40)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x4001637a40, 0x40007d63c0)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3512
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3802 [chan receive, 30 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400047a360, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3790
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3812 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3811
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 914 [chan receive, 103 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004fff3e0, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 848
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 685 [IO wait, 114 minutes]:
internal/poll.runtime_pollWait(0xffff52caf800, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x40002b7800?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x40002b7800)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x40002b7800)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40007d6c40)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40007d6c40)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40000fa700, {0x36d3f80, 0x40007d6c40})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40000fa700)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 683
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 5523 [chan receive, 4 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004ffee40, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5502
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5526 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x40025bdc10, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40025bdc00)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004ffee40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4004eef650?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x400143f6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x4001318f38, {0x369e4a0, 0x4001360930}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400143f7a8?, {0x369e4a0?, 0x4001360930?}, 0x0?, 0x4001460c00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40014ccb20, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5523
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3319 [chan receive, 34 minutes]:
testing.(*T).Run(0x4001b4c700, {0x296d71f?, 0x400132bf58?}, 0x339bcf8)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x4001b4c700)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x4001b4c700, 0x339bb10)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4615 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x4001330780?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4611
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4300 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400047af00, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4298
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1591 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x400184cfc0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1590
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5995 [IO wait]:
internal/poll.runtime_pollWait(0xffff5306cc00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001a620c0?, 0x40013a8b54?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001a620c0, {0x40013a8b54, 0x4ac, 0x4ac})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x400071a040, {0x40013a8b54?, 0x400143fd68?, 0x8b27c?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x400069a750, {0x369c878, 0x40000a6040})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369ca60, 0x400069a750}, {0x369c878, 0x40000a6040}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x400071a040?, {0x369ca60, 0x400069a750})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x400071a040, {0x369ca60, 0x400069a750})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369ca60, 0x400069a750}, {0x369c8f8, 0x400071a040}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x4004f3b340?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 5994
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 5854 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40025bde50, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40025bde40)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40016253e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001aaef50?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x400143e6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x400131bf38, {0x369e4a0, 0x400135d7a0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369e4a0?, 0x400135d7a0?}, 0x0?, 0x36e6598?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019e1860, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5851
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5522 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x40013f1dc0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5502
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1223 [select, 103 minutes]:
net/http.(*persistConn).readLoop(0x400189ea20)
	/usr/local/go/src/net/http/transport.go:2398 +0xa6c
created by net/http.(*Transport).dialConn in goroutine 1221
	/usr/local/go/src/net/http/transport.go:1947 +0x111c

goroutine 1111 [chan send, 103 minutes]:
os/exec.(*Cmd).watchCtx(0x40017c1e00, 0x40017bb570)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 828
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5850 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x4000104380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5844
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 913 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x400141a380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 848
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1059 [chan send, 103 minutes]:
os/exec.(*Cmd).watchCtx(0x40017c0a80, 0x40017ba7e0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1058
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4883 [chan receive, 6 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001571e00, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4881
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1114 [chan send, 103 minutes]:
os/exec.(*Cmd).watchCtx(0x4001976300, 0x40017bb7a0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1113
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5996 [IO wait]:
internal/poll.runtime_pollWait(0xffff5306d000, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001a62180?, 0x40016113d1?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001a62180, {0x40016113d1, 0x6c2f, 0x6c2f})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x400071a088, {0x40016113d1?, 0x4001440568?, 0x8b27c?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x400069a780, {0x369c878, 0x40000a6048})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369ca60, 0x400069a780}, {0x369c878, 0x40000a6048}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x400071a088?, {0x369ca60, 0x400069a780})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x400071a088, {0x369ca60, 0x400069a780})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369ca60, 0x400069a780}, {0x369c8f8, 0x400071a088}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x400184ddc0?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 5994
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 4898 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x40013e6740, 0x40013e6788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x8c?, 0x40013e6740, 0x40013e6788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x40013e6750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x4000224080?, 0x40014ffdc0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4883
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1579 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1578
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3810 [sync.Cond.Wait, 6 minutes]:
sync.runtime_notifyListWait(0x40006b8110, 0x16)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40006b8100)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400047a360)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40004e6230?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x4001496f38, {0x369e4a0, 0x400151d200}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x400151d200?}, 0x70?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001c72e70, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3802
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4616 [chan receive, 8 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001625b60, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4611
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1224 [select, 103 minutes]:
net/http.(*persistConn).writeLoop(0x400189ea20)
	/usr/local/go/src/net/http/transport.go:2600 +0x94
created by net/http.(*Transport).dialConn in goroutine 1221
	/usr/local/go/src/net/http/transport.go:1948 +0x1164

goroutine 2021 [chan send, 73 minutes]:
os/exec.(*Cmd).watchCtx(0x4001331b00, 0x40015edb20)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 2020
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4321 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4304
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4620 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4619
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5203 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x40004e88c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5202
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5528 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5527
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4619 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x400009ff40, 0x400009ff88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x45?, 0x400009ff40, 0x400009ff88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x400009ff50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x4000224080?, 0x4001330780?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4616
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5527 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x40000a0f40, 0x40000a0f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x48?, 0x40000a0f40, 0x40000a0f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x400162a480?, 0x40000017c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001b0e300?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5523
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1344 [IO wait, 103 minutes]:
internal/poll.runtime_pollWait(0xffff5306c800, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x40018be600?, 0xdbd0c?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x40018be600)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x40018be600)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40019ec5c0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40019ec5c0)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40017be800, {0x36d3f80, 0x40019ec5c0})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40017be800)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1342
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 907 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x40025bc950, 0x29)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40025bc940)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004fff3e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40004d5500?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x400132ff38, {0x369e4a0, 0x4001304de0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f4250?, {0x369e4a0?, 0x4001304de0?}, 0xc0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40025bf970, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 914
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 908 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x40013ebf40, 0x4000470f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x31?, 0x40013ebf40, 0x40013ebf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x40002d5d60?, 0x400038a360?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400133c180?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 914
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4899 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4898
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 909 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 908
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5851 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40016253e0, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5844
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3801 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x4001460300?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3790
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4897 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x40017609d0, 0xe)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40017609c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3702ae0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001571e00)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400175d730?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e6930?, 0x4000104380?}, 0x400175d6b8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e6930, 0x4000104380}, 0x400131df38, {0x369e4a0, 0x4004f25980}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369e4a0?, 0x4004f25980?}, 0x0?, 0x36e6598?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4002414950, 0x3b9aca00, 0x0, 0x1, 0x4000104380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4883
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3512 [chan receive, 8 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001636e00, 0x339bcf8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3319
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3935 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36ff5e0, {{0x36f4250, 0x4000224080?}, 0x4001516f00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3931
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4127 [chan receive]:
testing.(*T).Run(0x400184c000, {0x2994252?, 0x40000006ee?}, 0x40004e4100)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x400184c000)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x400184c000, 0x40018be080)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3516
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3811 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x400143ef40, 0x400143ef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x38?, 0x400143ef40, 0x400143ef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x1?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001330000?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3802
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5855 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e6930, 0x4000104380}, 0x4001443f40, 0x4001443f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e6930, 0x4000104380}, 0x21?, 0x4001443f40, 0x4001443f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e6930?, 0x4000104380?}, 0x0?, 0x4001443f50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f4250?, 0x4000224080?, 0x4000104380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5851
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5856 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5855
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8


Test pass (236/316)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 6.98
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.09
9 TestDownloadOnly/v1.28.0/DeleteAll 0.22
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.14
12 TestDownloadOnly/v1.34.3/json-events 6.24
13 TestDownloadOnly/v1.34.3/preload-exists 0
17 TestDownloadOnly/v1.34.3/LogsDuration 0.09
18 TestDownloadOnly/v1.34.3/DeleteAll 0.21
19 TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds 0.14
21 TestDownloadOnly/v1.35.0-rc.1/json-events 5.42
22 TestDownloadOnly/v1.35.0-rc.1/preload-exists 0
26 TestDownloadOnly/v1.35.0-rc.1/LogsDuration 0.08
27 TestDownloadOnly/v1.35.0-rc.1/DeleteAll 0.21
28 TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds 0.15
30 TestBinaryMirror 0.59
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.07
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.08
36 TestAddons/Setup 130.22
40 TestAddons/serial/GCPAuth/Namespaces 0.19
41 TestAddons/serial/GCPAuth/FakeCredentials 8.89
57 TestAddons/StoppedEnableDisable 12.43
58 TestCertOptions 39.47
59 TestCertExpiration 244.03
61 TestForceSystemdFlag 35.84
62 TestForceSystemdEnv 38.48
67 TestErrorSpam/setup 31.25
68 TestErrorSpam/start 0.76
69 TestErrorSpam/status 1.22
70 TestErrorSpam/pause 6.58
71 TestErrorSpam/unpause 6.29
72 TestErrorSpam/stop 1.51
75 TestFunctional/serial/CopySyncFile 0
76 TestFunctional/serial/StartWithProxy 49.78
77 TestFunctional/serial/AuditLog 0
79 TestFunctional/serial/KubeContext 0.05
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.55
84 TestFunctional/serial/CacheCmd/cache/add_local 1.27
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.06
86 TestFunctional/serial/CacheCmd/cache/list 0.05
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.29
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.93
89 TestFunctional/serial/CacheCmd/cache/delete 0.11
92 TestFunctional/serial/ExtraConfig 35.48
93 TestFunctional/serial/ComponentHealth 0.09
94 TestFunctional/serial/LogsCmd 1.44
95 TestFunctional/serial/LogsFileCmd 1.48
96 TestFunctional/serial/InvalidService 4.3
98 TestFunctional/parallel/ConfigCmd 0.44
99 TestFunctional/parallel/DashboardCmd 11.56
100 TestFunctional/parallel/DryRun 0.43
101 TestFunctional/parallel/InternationalLanguage 0.22
102 TestFunctional/parallel/StatusCmd 1.16
106 TestFunctional/parallel/ServiceCmdConnect 7.61
107 TestFunctional/parallel/AddonsCmd 0.15
108 TestFunctional/parallel/PersistentVolumeClaim 19.94
110 TestFunctional/parallel/SSHCmd 0.71
111 TestFunctional/parallel/CpCmd 2.41
113 TestFunctional/parallel/FileSync 0.39
114 TestFunctional/parallel/CertSync 2.37
118 TestFunctional/parallel/NodeLabels 0.1
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.76
122 TestFunctional/parallel/License 0.3
124 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.65
125 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
127 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 8.4
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.08
129 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0
133 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
134 TestFunctional/parallel/ServiceCmd/DeployApp 7.22
135 TestFunctional/parallel/ProfileCmd/profile_not_create 0.51
136 TestFunctional/parallel/ProfileCmd/profile_list 0.41
137 TestFunctional/parallel/ProfileCmd/profile_json_output 0.42
138 TestFunctional/parallel/MountCmd/any-port 8.54
139 TestFunctional/parallel/ServiceCmd/List 0.6
140 TestFunctional/parallel/ServiceCmd/JSONOutput 0.54
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.39
142 TestFunctional/parallel/ServiceCmd/Format 0.38
143 TestFunctional/parallel/ServiceCmd/URL 0.5
144 TestFunctional/parallel/MountCmd/specific-port 2.25
145 TestFunctional/parallel/MountCmd/VerifyCleanup 1.76
146 TestFunctional/parallel/Version/short 0.09
147 TestFunctional/parallel/Version/components 1.33
148 TestFunctional/parallel/ImageCommands/ImageListShort 0.23
149 TestFunctional/parallel/ImageCommands/ImageListTable 0.28
150 TestFunctional/parallel/ImageCommands/ImageListJson 0.25
151 TestFunctional/parallel/ImageCommands/ImageListYaml 0.26
152 TestFunctional/parallel/ImageCommands/ImageBuild 3.98
153 TestFunctional/parallel/ImageCommands/Setup 0.68
154 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.77
155 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 0.86
156 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.28
157 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.46
158 TestFunctional/parallel/ImageCommands/ImageRemove 0.81
159 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.74
160 TestFunctional/parallel/UpdateContextCmd/no_changes 0.17
161 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.18
162 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.17
163 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.54
164 TestFunctional/delete_echo-server_images 0.04
165 TestFunctional/delete_my-image_image 0.02
166 TestFunctional/delete_minikube_cached_images 0.01
170 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile 0
172 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog 0
174 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext 0.05
178 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote 3.62
179 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local 1.07
180 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete 0.05
181 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list 0.05
182 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node 0.3
183 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload 1.78
184 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete 0.12
189 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd 0.92
190 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd 0.95
193 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd 0.5
195 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun 0.41
196 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage 0.2
202 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd 0.14
205 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd 0.77
206 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd 2.26
208 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync 0.28
209 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync 1.72
215 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled 0.67
217 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License 0.24
220 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel 0
227 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel 0.1
234 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create 0.42
235 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list 0.38
236 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output 0.4
238 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port 1.63
239 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup 2.06
240 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short 0.06
241 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components 0.49
242 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort 0.23
243 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable 0.23
244 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson 0.23
245 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml 0.23
246 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild 3.77
247 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup 0.25
248 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon 1.19
249 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon 0.82
250 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon 1.09
251 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile 0.39
252 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove 0.52
253 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile 0.76
254 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon 0.42
255 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes 0.14
256 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster 0.16
257 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters 0.14
258 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images 0.04
259 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image 0.01
260 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images 0.01
264 TestMultiControlPlane/serial/StartCluster 147.5
265 TestMultiControlPlane/serial/DeployApp 7.04
266 TestMultiControlPlane/serial/PingHostFromPods 1.5
267 TestMultiControlPlane/serial/AddWorkerNode 34.03
268 TestMultiControlPlane/serial/NodeLabels 0.13
269 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.04
270 TestMultiControlPlane/serial/CopyFile 20.1
271 TestMultiControlPlane/serial/StopSecondaryNode 12.86
272 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.85
273 TestMultiControlPlane/serial/RestartSecondaryNode 30.32
274 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.36
275 TestMultiControlPlane/serial/RestartClusterKeepsNodes 124.8
276 TestMultiControlPlane/serial/DeleteSecondaryNode 11.69
277 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.79
278 TestMultiControlPlane/serial/StopCluster 36.1
279 TestMultiControlPlane/serial/RestartCluster 82.74
280 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.81
281 TestMultiControlPlane/serial/AddSecondaryNode 53.38
282 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.05
287 TestJSONOutput/start/Command 52.59
288 TestJSONOutput/start/Audit 0
290 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
291 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
294 TestJSONOutput/pause/Audit 0
296 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
297 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
300 TestJSONOutput/unpause/Audit 0
302 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
303 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
305 TestJSONOutput/stop/Command 5.81
306 TestJSONOutput/stop/Audit 0
308 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
309 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
310 TestErrorJSONOutput 0.24
312 TestKicCustomNetwork/create_custom_network 39.95
313 TestKicCustomNetwork/use_default_bridge_network 34.73
314 TestKicExistingNetwork 37.86
315 TestKicCustomSubnet 34.94
316 TestKicStaticIP 36.04
317 TestMainNoArgs 0.05
318 TestMinikubeProfile 70.31
321 TestMountStart/serial/StartWithMountFirst 9.15
322 TestMountStart/serial/VerifyMountFirst 0.27
323 TestMountStart/serial/StartWithMountSecond 8.59
324 TestMountStart/serial/VerifyMountSecond 0.28
325 TestMountStart/serial/DeleteFirst 1.72
326 TestMountStart/serial/VerifyMountPostDelete 0.28
327 TestMountStart/serial/Stop 1.29
328 TestMountStart/serial/RestartStopped 8.29
329 TestMountStart/serial/VerifyMountPostStop 0.27
332 TestMultiNode/serial/FreshStart2Nodes 80.99
333 TestMultiNode/serial/DeployApp2Nodes 5.12
334 TestMultiNode/serial/PingHostFrom2Pods 0.92
335 TestMultiNode/serial/AddNode 30.42
336 TestMultiNode/serial/MultiNodeLabels 0.09
337 TestMultiNode/serial/ProfileList 0.74
338 TestMultiNode/serial/CopyFile 10.51
339 TestMultiNode/serial/StopNode 2.46
340 TestMultiNode/serial/StartAfterStop 8.27
341 TestMultiNode/serial/RestartKeepsNodes 78.26
342 TestMultiNode/serial/DeleteNode 5.68
343 TestMultiNode/serial/StopMultiNode 24
344 TestMultiNode/serial/RestartMultiNode 47.41
345 TestMultiNode/serial/ValidateNameConflict 35.48
350 TestPreload 118.4
352 TestScheduledStopUnix 110.52
355 TestInsufficientStorage 13.22
356 TestRunningBinaryUpgrade 295.98
359 TestMissingContainerUpgrade 114.44
361 TestNoKubernetes/serial/StartNoK8sWithVersion 0.09
362 TestNoKubernetes/serial/StartWithK8s 45.83
363 TestNoKubernetes/serial/StartWithStopK8s 8.38
364 TestNoKubernetes/serial/Start 9.15
365 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
366 TestNoKubernetes/serial/VerifyK8sNotRunning 0.42
367 TestNoKubernetes/serial/ProfileList 3.19
368 TestNoKubernetes/serial/Stop 1.29
369 TestNoKubernetes/serial/StartNoArgs 7.07
370 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.28
371 TestStoppedBinaryUpgrade/Setup 1.04
372 TestStoppedBinaryUpgrade/Upgrade 303.32
373 TestStoppedBinaryUpgrade/MinikubeLogs 1.71
382 TestPause/serial/Start 54.09
383 TestPause/serial/SecondStartNoReconfiguration 28.05
TestDownloadOnly/v1.28.0/json-events (6.98s)

=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio: (6.976428622s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (6.98s)

TestDownloadOnly/v1.28.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1218 00:12:06.305121 1159552 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime crio
I1218 00:12:06.305198 1159552 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-cri-o-overlay-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)

TestDownloadOnly/v1.28.0/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-293945
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-293945: exit status 85 (92.044535ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                   ARGS                                                                                    │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio │ download-only-293945 │ jenkins │ v1.37.0 │ 18 Dec 25 00:11 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:11:59
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:11:59.369135 1159557 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:11:59.369264 1159557 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:11:59.369274 1159557 out.go:374] Setting ErrFile to fd 2...
	I1218 00:11:59.369279 1159557 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:11:59.369534 1159557 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	W1218 00:11:59.369663 1159557 root.go:314] Error reading config file at /home/jenkins/minikube-integration/22186-1156339/.minikube/config/config.json: open /home/jenkins/minikube-integration/22186-1156339/.minikube/config/config.json: no such file or directory
	I1218 00:11:59.370056 1159557 out.go:368] Setting JSON to true
	I1218 00:11:59.370865 1159557 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":24868,"bootTime":1765991852,"procs":151,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:11:59.370930 1159557 start.go:143] virtualization:  
	I1218 00:11:59.376447 1159557 out.go:99] [download-only-293945] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1218 00:11:59.376654 1159557 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball: no such file or directory
	I1218 00:11:59.376709 1159557 notify.go:221] Checking for updates...
	I1218 00:11:59.380128 1159557 out.go:171] MINIKUBE_LOCATION=22186
	I1218 00:11:59.383655 1159557 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:11:59.386873 1159557 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:11:59.389913 1159557 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:11:59.393015 1159557 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1218 00:11:59.398900 1159557 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 00:11:59.399142 1159557 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:11:59.431748 1159557 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:11:59.431944 1159557 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:11:59.489948 1159557 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-18 00:11:59.480806183 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:11:59.490051 1159557 docker.go:319] overlay module found
	I1218 00:11:59.493218 1159557 out.go:99] Using the docker driver based on user configuration
	I1218 00:11:59.493257 1159557 start.go:309] selected driver: docker
	I1218 00:11:59.493264 1159557 start.go:927] validating driver "docker" against <nil>
	I1218 00:11:59.493360 1159557 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:11:59.548092 1159557 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-18 00:11:59.538754287 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:11:59.548283 1159557 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:11:59.548555 1159557 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1218 00:11:59.548716 1159557 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1218 00:11:59.551979 1159557 out.go:171] Using Docker driver with root privileges
	I1218 00:11:59.555100 1159557 cni.go:84] Creating CNI manager for ""
	I1218 00:11:59.555178 1159557 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:11:59.555193 1159557 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:11:59.555287 1159557 start.go:353] cluster config:
	{Name:download-only-293945 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-293945 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:11:59.558348 1159557 out.go:99] Starting "download-only-293945" primary control-plane node in "download-only-293945" cluster
	I1218 00:11:59.558377 1159557 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:11:59.561350 1159557 out.go:99] Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:11:59.561404 1159557 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime crio
	I1218 00:11:59.561507 1159557 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:11:59.578742 1159557 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:11:59.578938 1159557 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1218 00:11:59.579033 1159557 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:11:59.629223 1159557 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-cri-o-overlay-arm64.tar.lz4
	I1218 00:11:59.629253 1159557 cache.go:65] Caching tarball of preloaded images
	I1218 00:11:59.629411 1159557 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime crio
	I1218 00:11:59.632765 1159557 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1218 00:11:59.632797 1159557 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-cri-o-overlay-arm64.tar.lz4 from gcs api...
	I1218 00:11:59.721734 1159557 preload.go:295] Got checksum from GCS API "e092595ade89dbfc477bd4cd6b9c633b"
	I1218 00:11:59.721863 1159557 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-cri-o-overlay-arm64.tar.lz4?checksum=md5:e092595ade89dbfc477bd4cd6b9c633b -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-cri-o-overlay-arm64.tar.lz4
	
	
	* The control-plane node download-only-293945 host does not exist
	  To start a cluster, run: "minikube start -p download-only-293945"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.09s)

TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-293945
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.34.3/json-events (6.24s)

=== RUN   TestDownloadOnly/v1.34.3/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-782051 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=crio --driver=docker  --container-runtime=crio
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-782051 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=crio --driver=docker  --container-runtime=crio: (6.237502269s)
--- PASS: TestDownloadOnly/v1.34.3/json-events (6.24s)

TestDownloadOnly/v1.34.3/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.34.3/preload-exists
I1218 00:12:12.988674 1159552 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
I1218 00:12:12.988708 1159552 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.3/preload-exists (0.00s)

TestDownloadOnly/v1.34.3/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.34.3/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-782051
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-782051: exit status 85 (88.483986ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                   ARGS                                                                                    │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio │ download-only-293945 │ jenkins │ v1.37.0 │ 18 Dec 25 00:11 UTC │                     │
	│ delete  │ --all                                                                                                                                                                     │ minikube             │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-293945                                                                                                                                                   │ download-only-293945 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ -o=json --download-only -p download-only-782051 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=crio --driver=docker  --container-runtime=crio │ download-only-782051 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:12:06
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:12:06.793990 1159757 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:12:06.794505 1159757 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:06.794585 1159757 out.go:374] Setting ErrFile to fd 2...
	I1218 00:12:06.794638 1159757 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:06.794992 1159757 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:12:06.795494 1159757 out.go:368] Setting JSON to true
	I1218 00:12:06.796586 1159757 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":24875,"bootTime":1765991852,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:12:06.796717 1159757 start.go:143] virtualization:  
	I1218 00:12:06.800314 1159757 out.go:99] [download-only-782051] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:12:06.800788 1159757 notify.go:221] Checking for updates...
	I1218 00:12:06.803890 1159757 out.go:171] MINIKUBE_LOCATION=22186
	I1218 00:12:06.807035 1159757 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:12:06.810013 1159757 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:12:06.813003 1159757 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:12:06.816011 1159757 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1218 00:12:06.821567 1159757 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 00:12:06.821922 1159757 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:12:06.857076 1159757 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:12:06.857223 1159757 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:06.918573 1159757 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:48 SystemTime:2025-12-18 00:12:06.909295152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:06.918677 1159757 docker.go:319] overlay module found
	I1218 00:12:06.921583 1159757 out.go:99] Using the docker driver based on user configuration
	I1218 00:12:06.921618 1159757 start.go:309] selected driver: docker
	I1218 00:12:06.921628 1159757 start.go:927] validating driver "docker" against <nil>
	I1218 00:12:06.921738 1159757 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:06.978379 1159757 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:48 SystemTime:2025-12-18 00:12:06.96880729 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:06.978537 1159757 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:12:06.978830 1159757 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1218 00:12:06.978987 1159757 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1218 00:12:06.982089 1159757 out.go:171] Using Docker driver with root privileges
	I1218 00:12:06.984877 1159757 cni.go:84] Creating CNI manager for ""
	I1218 00:12:06.984941 1159757 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:06.984956 1159757 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:12:06.985032 1159757 start.go:353] cluster config:
	{Name:download-only-782051 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:download-only-782051 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:06.988006 1159757 out.go:99] Starting "download-only-782051" primary control-plane node in "download-only-782051" cluster
	I1218 00:12:06.988033 1159757 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:12:06.990933 1159757 out.go:99] Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:12:06.990991 1159757 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:06.991199 1159757 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:12:07.007637 1159757 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:12:07.007776 1159757 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1218 00:12:07.007801 1159757 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory, skipping pull
	I1218 00:12:07.007808 1159757 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in cache, skipping pull
	I1218 00:12:07.007816 1159757 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 as a tarball
	I1218 00:12:07.050427 1159757 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.3/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	I1218 00:12:07.050457 1159757 cache.go:65] Caching tarball of preloaded images
	I1218 00:12:07.050652 1159757 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime crio
	I1218 00:12:07.053775 1159757 out.go:99] Downloading Kubernetes v1.34.3 preload ...
	I1218 00:12:07.053807 1159757 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4 from gcs api...
	I1218 00:12:07.144134 1159757 preload.go:295] Got checksum from GCS API "c7c3cca4fcbe5ef642ca3e3e5575910e"
	I1218 00:12:07.144197 1159757 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.3/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4?checksum=md5:c7c3cca4fcbe5ef642ca3e3e5575910e -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-cri-o-overlay-arm64.tar.lz4
	
	
	* The control-plane node download-only-782051 host does not exist
	  To start a cluster, run: "minikube start -p download-only-782051"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.3/LogsDuration (0.09s)

TestDownloadOnly/v1.34.3/DeleteAll (0.21s)

=== RUN   TestDownloadOnly/v1.34.3/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.3/DeleteAll (0.21s)

TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-782051
--- PASS: TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.35.0-rc.1/json-events (5.42s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-398668 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=crio --driver=docker  --container-runtime=crio
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-398668 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=crio --driver=docker  --container-runtime=crio: (5.420947994s)
--- PASS: TestDownloadOnly/v1.35.0-rc.1/json-events (5.42s)

TestDownloadOnly/v1.35.0-rc.1/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/preload-exists
I1218 00:12:18.848691 1159552 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
I1218 00:12:18.848728 1159552 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.35.0-rc.1/preload-exists (0.00s)

TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.08s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-398668
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-398668: exit status 85 (79.873153ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                      ARGS                                                                                      │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-293945 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=crio --driver=docker  --container-runtime=crio      │ download-only-293945 │ jenkins │ v1.37.0 │ 18 Dec 25 00:11 UTC │                     │
	│ delete  │ --all                                                                                                                                                                          │ minikube             │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-293945                                                                                                                                                        │ download-only-293945 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ -o=json --download-only -p download-only-782051 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=crio --driver=docker  --container-runtime=crio      │ download-only-782051 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	│ delete  │ --all                                                                                                                                                                          │ minikube             │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ delete  │ -p download-only-782051                                                                                                                                                        │ download-only-782051 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │ 18 Dec 25 00:12 UTC │
	│ start   │ -o=json --download-only -p download-only-398668 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=crio --driver=docker  --container-runtime=crio │ download-only-398668 │ jenkins │ v1.37.0 │ 18 Dec 25 00:12 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/18 00:12:13
	Running on machine: ip-172-31-21-244
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1218 00:12:13.476581 1159960 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:12:13.476758 1159960 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:13.476784 1159960 out.go:374] Setting ErrFile to fd 2...
	I1218 00:12:13.476804 1159960 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:12:13.477188 1159960 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:12:13.477719 1159960 out.go:368] Setting JSON to true
	I1218 00:12:13.478712 1159960 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":24882,"bootTime":1765991852,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:12:13.478808 1159960 start.go:143] virtualization:  
	I1218 00:12:13.482428 1159960 out.go:99] [download-only-398668] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:12:13.482682 1159960 notify.go:221] Checking for updates...
	I1218 00:12:13.485475 1159960 out.go:171] MINIKUBE_LOCATION=22186
	I1218 00:12:13.488564 1159960 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:12:13.491457 1159960 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:12:13.494361 1159960 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:12:13.497345 1159960 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1218 00:12:13.503002 1159960 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1218 00:12:13.503299 1159960 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:12:13.535898 1159960 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:12:13.536006 1159960 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:13.597139 1159960 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:13.587313781 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:13.597245 1159960 docker.go:319] overlay module found
	I1218 00:12:13.600182 1159960 out.go:99] Using the docker driver based on user configuration
	I1218 00:12:13.600281 1159960 start.go:309] selected driver: docker
	I1218 00:12:13.600291 1159960 start.go:927] validating driver "docker" against <nil>
	I1218 00:12:13.600401 1159960 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:12:13.657796 1159960 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-18 00:12:13.647762858 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:12:13.657953 1159960 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1218 00:12:13.658236 1159960 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1218 00:12:13.658382 1159960 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1218 00:12:13.661442 1159960 out.go:171] Using Docker driver with root privileges
	I1218 00:12:13.664334 1159960 cni.go:84] Creating CNI manager for ""
	I1218 00:12:13.664404 1159960 cni.go:143] "docker" driver + "crio" runtime found, recommending kindnet
	I1218 00:12:13.664417 1159960 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1218 00:12:13.664498 1159960 start.go:353] cluster config:
	{Name:download-only-398668 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:download-only-398668 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:12:13.667605 1159960 out.go:99] Starting "download-only-398668" primary control-plane node in "download-only-398668" cluster
	I1218 00:12:13.667624 1159960 cache.go:134] Beginning downloading kic base image for docker with crio
	I1218 00:12:13.670465 1159960 out.go:99] Pulling base image v0.0.48-1765966054-22186 ...
	I1218 00:12:13.670512 1159960 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:12:13.670685 1159960 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1218 00:12:13.687202 1159960 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1218 00:12:13.687346 1159960 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1218 00:12:13.687377 1159960 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory, skipping pull
	I1218 00:12:13.687383 1159960 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in cache, skipping pull
	I1218 00:12:13.687390 1159960 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 as a tarball
	I1218 00:12:13.730426 1159960 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-rc.1/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:12:13.730455 1159960 cache.go:65] Caching tarball of preloaded images
	I1218 00:12:13.730641 1159960 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:12:13.733824 1159960 out.go:99] Downloading Kubernetes v1.35.0-rc.1 preload ...
	I1218 00:12:13.733875 1159960 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4 from gcs api...
	I1218 00:12:13.815364 1159960 preload.go:295] Got checksum from GCS API "efae947990a69f0349b1b3fdbfa98de4"
	I1218 00:12:13.815416 1159960 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-rc.1/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4?checksum=md5:efae947990a69f0349b1b3fdbfa98de4 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-cri-o-overlay-arm64.tar.lz4
	I1218 00:12:18.245413 1159960 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on crio
	I1218 00:12:18.245806 1159960 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/download-only-398668/config.json ...
	I1218 00:12:18.245841 1159960 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/download-only-398668/config.json: {Name:mka404456feb862f3cd2d89930c5191aaf8d2611 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1218 00:12:18.246043 1159960 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime crio
	I1218 00:12:18.246206 1159960 download.go:108] Downloading: https://dl.k8s.io/release/v1.35.0-rc.1/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-rc.1/bin/linux/arm64/kubectl.sha256 -> /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/linux/arm64/v1.35.0-rc.1/kubectl
	
	
	* The control-plane node download-only-398668 host does not exist
	  To start a cluster, run: "minikube start -p download-only-398668"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.08s)

TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.21s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.21s)

TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.15s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-398668
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.15s)

TestBinaryMirror (0.59s)

=== RUN   TestBinaryMirror
I1218 00:12:20.131395 1159552 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-966047 --alsologtostderr --binary-mirror http://127.0.0.1:45057 --driver=docker  --container-runtime=crio
helpers_test.go:176: Cleaning up "binary-mirror-966047" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-966047
--- PASS: TestBinaryMirror (0.59s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1002: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-399099
addons_test.go:1002: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-399099: exit status 85 (69.386749ms)

-- stdout --
	* Profile "addons-399099" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-399099"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1013: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-399099
addons_test.go:1013: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-399099: exit status 85 (82.236073ms)

-- stdout --
	* Profile "addons-399099" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-399099"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

TestAddons/Setup (130.22s)

=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-linux-arm64 start -p addons-399099 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=crio --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:110: (dbg) Done: out/minikube-linux-arm64 start -p addons-399099 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=crio --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m10.216706473s)
--- PASS: TestAddons/Setup (130.22s)

TestAddons/serial/GCPAuth/Namespaces (0.19s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:632: (dbg) Run:  kubectl --context addons-399099 create ns new-namespace
addons_test.go:646: (dbg) Run:  kubectl --context addons-399099 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.19s)

TestAddons/serial/GCPAuth/FakeCredentials (8.89s)

=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:677: (dbg) Run:  kubectl --context addons-399099 create -f testdata/busybox.yaml
addons_test.go:684: (dbg) Run:  kubectl --context addons-399099 create sa gcp-auth-test
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:353: "busybox" [a171d19f-dfd3-4558-a4cf-0b54d6fc0c69] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "busybox" [a171d19f-dfd3-4558-a4cf-0b54d6fc0c69] Running
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 8.004755339s
addons_test.go:696: (dbg) Run:  kubectl --context addons-399099 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:708: (dbg) Run:  kubectl --context addons-399099 describe sa gcp-auth-test
addons_test.go:722: (dbg) Run:  kubectl --context addons-399099 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:746: (dbg) Run:  kubectl --context addons-399099 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (8.89s)

TestAddons/StoppedEnableDisable (12.43s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-399099
addons_test.go:174: (dbg) Done: out/minikube-linux-arm64 stop -p addons-399099: (12.163182802s)
addons_test.go:178: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-399099
addons_test.go:182: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-399099
addons_test.go:187: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-399099
--- PASS: TestAddons/StoppedEnableDisable (12.43s)

TestCertOptions (39.47s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-734383 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=crio
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-734383 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=crio: (36.3809262s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-734383 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-734383 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-734383 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:176: Cleaning up "cert-options-734383" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-734383
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-734383: (2.332837233s)
--- PASS: TestCertOptions (39.47s)

TestCertExpiration (244.03s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-631298 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=crio
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-631298 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=crio: (39.431267025s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-631298 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=crio
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-631298 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=crio: (21.578293265s)
helpers_test.go:176: Cleaning up "cert-expiration-631298" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-631298
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-631298: (3.019778159s)
--- PASS: TestCertExpiration (244.03s)

TestForceSystemdFlag (35.84s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-843267 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=crio
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-843267 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=crio: (33.007013752s)
docker_test.go:132: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-843267 ssh "cat /etc/crio/crio.conf.d/02-crio.conf"
helpers_test.go:176: Cleaning up "force-systemd-flag-843267" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-843267
E1218 01:39:32.021038 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-843267: (2.508495691s)
--- PASS: TestForceSystemdFlag (35.84s)

TestForceSystemdEnv (38.48s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-066220 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-066220 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio: (35.54540419s)
helpers_test.go:176: Cleaning up "force-systemd-env-066220" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-066220
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-066220: (2.931996904s)
--- PASS: TestForceSystemdEnv (38.48s)

TestErrorSpam/setup (31.25s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-499800 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-499800 --driver=docker  --container-runtime=crio
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-499800 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-499800 --driver=docker  --container-runtime=crio: (31.250785977s)
--- PASS: TestErrorSpam/setup (31.25s)

TestErrorSpam/start (0.76s)

=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 start --dry-run
--- PASS: TestErrorSpam/start (0.76s)

TestErrorSpam/status (1.22s)

=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 status
--- PASS: TestErrorSpam/status (1.22s)

TestErrorSpam/pause (6.58s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause
error_spam_test.go:149: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause: exit status 80 (2.288887503s)

-- stdout --
	* Pausing node nospam-499800 ... 
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_PAUSE: Pause: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:27Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
error_spam_test.go:151: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause" failed: exit status 80
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause
error_spam_test.go:149: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause: exit status 80 (1.961903232s)

-- stdout --
	* Pausing node nospam-499800 ... 
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_PAUSE: Pause: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:29Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
error_spam_test.go:151: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause" failed: exit status 80
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause
error_spam_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause: exit status 80 (2.326491299s)

-- stdout --
	* Pausing node nospam-499800 ... 
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_PAUSE: Pause: list running: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:31Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
error_spam_test.go:174: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 pause" failed: exit status 80
--- PASS: TestErrorSpam/pause (6.58s)

TestErrorSpam/unpause (6.29s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause
error_spam_test.go:149: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause: exit status 80 (1.995363274s)

-- stdout --
	* Unpausing node nospam-499800 ... 
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_UNPAUSE: Pause: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:33Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
error_spam_test.go:151: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause" failed: exit status 80
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause
error_spam_test.go:149: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause: exit status 80 (1.984696313s)
-- stdout --
	* Unpausing node nospam-499800 ... 
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_UNPAUSE: Pause: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:35Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
error_spam_test.go:151: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause" failed: exit status 80
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause
error_spam_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause: exit status 80 (2.308511436s)
-- stdout --
	* Unpausing node nospam-499800 ... 
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_UNPAUSE: Pause: list paused: runc: sudo runc list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-18T00:18:37Z" level=error msg="open /run/runc: no such file or directory"
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_delete_7f6b85125f52d8b6f2676a081a2b9f4eb5a7d9b1_1.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
error_spam_test.go:174: "out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 unpause" failed: exit status 80
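All three unpause retries above fail with the same machine-readable error ID in stderr. A minimal triage sketch (assuming only the stderr format shown above; the sample line is copied verbatim from this report):

```shell
# Sample stderr line, copied from the unpause failures above.
line='X Exiting due to GUEST_UNPAUSE: Pause: list paused: runc: sudo runc list -f json: Process exited with status 1'

# Pull out the error ID (the ALL_CAPS token after "Exiting due to").
err_id=$(printf '%s\n' "$line" | sed -n 's/^X Exiting due to \([A-Z_]*\):.*/\1/p')
echo "$err_id"
```

Here the underlying cause is visible one line further down in each dump: `runc` cannot open its state directory (`open /run/runc: no such file or directory`), so the `sudo runc list -f json` probe inside the guest fails before anything can be unpaused.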
--- PASS: TestErrorSpam/unpause (6.29s)

TestErrorSpam/stop (1.51s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 stop: (1.30605097s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-499800 --log_dir /tmp/nospam-499800 stop
--- PASS: TestErrorSpam/stop (1.51s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (49.78s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio
E1218 00:19:32.029768 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.036305 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.047773 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.069199 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.110553 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.191947 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.353407 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:32.675137 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 00:19:33.320399 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-240845 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=crio: (49.77468019s)
--- PASS: TestFunctional/serial/StartWithProxy (49.78s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/KubeContext (0.05s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.05s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.55s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:3.1: (1.237739029s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:3.3: (1.1698748s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 cache add registry.k8s.io/pause:latest: (1.143034224s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.55s)

TestFunctional/serial/CacheCmd/cache/add_local (1.27s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-240845 /tmp/TestFunctionalserialCacheCmdcacheadd_local1024801609/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache add minikube-local-cache-test:functional-240845
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache delete minikube-local-cache-test:functional-240845
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-240845
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.27s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

TestFunctional/serial/CacheCmd/cache/list (0.05s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.05s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.93s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (313.629198ms)
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.93s)

TestFunctional/serial/CacheCmd/cache/delete (0.11s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.11s)

TestFunctional/serial/ExtraConfig (35.48s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-240845 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (35.475421731s)
functional_test.go:776: restart took 35.475500637s for "functional-240845" cluster.
I1218 00:28:10.270946 1159552 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
--- PASS: TestFunctional/serial/ExtraConfig (35.48s)
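The `--extra-config` flag used in this restart packs component, key, and value into one string. A sketch of how that `component.key=value` shape decomposes (the splitting logic here is ours, not minikube's actual parser):

```shell
# Hypothetical decomposition of the exact flag value used in this run.
opt='apiserver.enable-admission-plugins=NamespaceAutoProvision'
component=${opt%%.*}   # text before the first dot
rest=${opt#*.}
key=${rest%%=*}        # text between the dot and '='
value=${rest#*=}       # text after '='
echo "$component / $key / $value"
```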

TestFunctional/serial/ComponentHealth (0.09s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-240845 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.09s)

TestFunctional/serial/LogsCmd (1.44s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs: (1.443631686s)
--- PASS: TestFunctional/serial/LogsCmd (1.44s)

TestFunctional/serial/LogsFileCmd (1.48s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 logs --file /tmp/TestFunctionalserialLogsFileCmd1673284579/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 logs --file /tmp/TestFunctionalserialLogsFileCmd1673284579/001/logs.txt: (1.482277484s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.48s)

TestFunctional/serial/InvalidService (4.3s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-240845 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-240845
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-240845: exit status 115 (387.455205ms)
-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:31032 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	
-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-240845 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.30s)

TestFunctional/parallel/ConfigCmd (0.44s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 config get cpus: exit status 14 (92.124304ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 config get cpus: exit status 14 (68.018521ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.44s)

TestFunctional/parallel/DashboardCmd (11.56s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-240845 --alsologtostderr -v=1]
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-240845 --alsologtostderr -v=1] ...
helpers_test.go:526: unable to kill pid 1186566: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (11.56s)

TestFunctional/parallel/DryRun (0.43s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-240845 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio: exit status 23 (192.218555ms)
-- stdout --
	* [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I1218 00:28:46.791077 1186286 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:28:46.791197 1186286 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:28:46.791202 1186286 out.go:374] Setting ErrFile to fd 2...
	I1218 00:28:46.791207 1186286 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:28:46.791999 1186286 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:28:46.792489 1186286 out.go:368] Setting JSON to false
	I1218 00:28:46.793319 1186286 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25875,"bootTime":1765991852,"procs":187,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:28:46.793382 1186286 start.go:143] virtualization:  
	I1218 00:28:46.796430 1186286 out.go:179] * [functional-240845] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:28:46.800158 1186286 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:28:46.800193 1186286 notify.go:221] Checking for updates...
	I1218 00:28:46.803251 1186286 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:28:46.806348 1186286 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:28:46.809167 1186286 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:28:46.811995 1186286 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:28:46.814896 1186286 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:28:46.818105 1186286 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:28:46.818654 1186286 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:28:46.851642 1186286 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:28:46.851779 1186286 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:28:46.917618 1186286 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-18 00:28:46.908628427 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:28:46.917725 1186286 docker.go:319] overlay module found
	I1218 00:28:46.920818 1186286 out.go:179] * Using the docker driver based on existing profile
	I1218 00:28:46.923532 1186286 start.go:309] selected driver: docker
	I1218 00:28:46.923550 1186286 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:28:46.923651 1186286 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:28:46.927230 1186286 out.go:203] 
	W1218 00:28:46.930284 1186286 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1218 00:28:46.933374 1186286 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
--- PASS: TestFunctional/parallel/DryRun (0.43s)

TestFunctional/parallel/InternationalLanguage (0.22s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-240845 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-240845 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio: exit status 23 (215.626703ms)

-- stdout --
	* [functional-240845] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1218 00:28:46.592706 1186238 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:28:46.592950 1186238 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:28:46.592987 1186238 out.go:374] Setting ErrFile to fd 2...
	I1218 00:28:46.593004 1186238 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:28:46.594722 1186238 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:28:46.595240 1186238 out.go:368] Setting JSON to false
	I1218 00:28:46.596330 1186238 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":25875,"bootTime":1765991852,"procs":187,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:28:46.596428 1186238 start.go:143] virtualization:  
	I1218 00:28:46.599874 1186238 out.go:179] * [functional-240845] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1218 00:28:46.603753 1186238 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:28:46.603858 1186238 notify.go:221] Checking for updates...
	I1218 00:28:46.609535 1186238 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:28:46.612371 1186238 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:28:46.615303 1186238 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:28:46.618192 1186238 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:28:46.621094 1186238 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:28:46.624642 1186238 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 00:28:46.625233 1186238 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:28:46.662703 1186238 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:28:46.662833 1186238 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:28:46.725124 1186238 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-18 00:28:46.715218894 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:28:46.725228 1186238 docker.go:319] overlay module found
	I1218 00:28:46.728404 1186238 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1218 00:28:46.731269 1186238 start.go:309] selected driver: docker
	I1218 00:28:46.731288 1186238 start.go:927] validating driver "docker" against &{Name:functional-240845 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-240845 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] Moun
tPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:28:46.731391 1186238 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:28:46.734886 1186238 out.go:203] 
	W1218 00:28:46.737667 1186238 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1218 00:28:46.740544 1186238 out.go:203] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.22s)

TestFunctional/parallel/StatusCmd (1.16s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.16s)

TestFunctional/parallel/ServiceCmdConnect (7.61s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-240845 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-240845 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:353: "hello-node-connect-7d85dfc575-fqlpk" [da677ffc-22b1-4286-97c3-32ec55bfdca2] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-connect-7d85dfc575-fqlpk" [da677ffc-22b1-4286-97c3-32ec55bfdca2] Running
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.003549135s
functional_test.go:1654: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service hello-node-connect --url
functional_test.go:1660: found endpoint for hello-node-connect: http://192.168.49.2:30605
functional_test.go:1680: http://192.168.49.2:30605: success! body:
Request served by hello-node-connect-7d85dfc575-fqlpk

HTTP/1.1 GET /

Host: 192.168.49.2:30605
Accept-Encoding: gzip
User-Agent: Go-http-client/1.1
--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.61s)

TestFunctional/parallel/AddonsCmd (0.15s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.15s)

TestFunctional/parallel/PersistentVolumeClaim (19.94s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:353: "storage-provisioner" [36dc300a-a099-40d7-874e-e5c2b3795445] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.006536249s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-240845 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-240845 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-240845 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-240845 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [683c87b1-96e7-4fe7-96f4-b0226da15672] Pending
helpers_test.go:353: "sp-pod" [683c87b1-96e7-4fe7-96f4-b0226da15672] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:353: "sp-pod" [683c87b1-96e7-4fe7-96f4-b0226da15672] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.003186183s
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-240845 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:112: (dbg) Run:  kubectl --context functional-240845 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-240845 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [c59bfead-ffe7-4659-9feb-b5f2b7d75311] Pending
helpers_test.go:353: "sp-pod" [c59bfead-ffe7-4659-9feb-b5f2b7d75311] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 6.003964222s
functional_test_pvc_test.go:120: (dbg) Run:  kubectl --context functional-240845 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (19.94s)

TestFunctional/parallel/SSHCmd (0.71s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.71s)

TestFunctional/parallel/CpCmd (2.41s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh -n functional-240845 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cp functional-240845:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd1530818071/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh -n functional-240845 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh -n functional-240845 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.41s)

TestFunctional/parallel/FileSync (0.39s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/1159552/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /etc/test/nested/copy/1159552/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.39s)

TestFunctional/parallel/CertSync (2.37s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/1159552.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /etc/ssl/certs/1159552.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/1159552.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /usr/share/ca-certificates/1159552.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/11595522.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /etc/ssl/certs/11595522.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/11595522.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /usr/share/ca-certificates/11595522.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.37s)

TestFunctional/parallel/NodeLabels (0.1s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-240845 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.10s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.76s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh "sudo systemctl is-active docker": exit status 1 (342.065493ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo systemctl is-active containerd"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh "sudo systemctl is-active containerd": exit status 1 (415.810204ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.76s)

TestFunctional/parallel/License (0.3s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.30s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.65s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr] ...
helpers_test.go:526: unable to kill pid 1184068: os: process already finished
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.65s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.4s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-240845 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:353: "nginx-svc" [d3b85eba-7577-4e72-b46b-aea159552540] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx-svc" [d3b85eba-7577-4e72-b46b-aea159552540] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 8.004573489s
I1218 00:28:27.787350 1159552 kapi.go:150] Service nginx-svc in namespace default found.
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.40s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-240845 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.101.219.144 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-240845 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

TestFunctional/parallel/ServiceCmd/DeployApp (7.22s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-240845 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-240845 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:353: "hello-node-75c85bcc94-h6kbk" [32acb95b-1359-478f-897a-b9dda4bd356d] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-75c85bcc94-h6kbk" [32acb95b-1359-478f-897a-b9dda4bd356d] Running
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 7.004682128s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (7.22s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.51s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.51s)

TestFunctional/parallel/ProfileCmd/profile_list (0.41s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "357.136272ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "49.101073ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.41s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.42s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "355.989716ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "61.770753ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.42s)

TestFunctional/parallel/MountCmd/any-port (8.54s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdany-port4165054613/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1766017721271245136" to /tmp/TestFunctionalparallelMountCmdany-port4165054613/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1766017721271245136" to /tmp/TestFunctionalparallelMountCmdany-port4165054613/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1766017721271245136" to /tmp/TestFunctionalparallelMountCmdany-port4165054613/001/test-1766017721271245136
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (359.896391ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1218 00:28:41.632020 1159552 retry.go:31] will retry after 690.927744ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 18 00:28 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 18 00:28 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 18 00:28 test-1766017721271245136
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh cat /mount-9p/test-1766017721271245136
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-240845 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:353: "busybox-mount" [bbf3d2da-0291-48f0-8a0b-ff26ef198fab] Pending
helpers_test.go:353: "busybox-mount" [bbf3d2da-0291-48f0-8a0b-ff26ef198fab] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:353: "busybox-mount" [bbf3d2da-0291-48f0-8a0b-ff26ef198fab] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "busybox-mount" [bbf3d2da-0291-48f0-8a0b-ff26ef198fab] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.017766723s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-240845 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdany-port4165054613/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.54s)

TestFunctional/parallel/ServiceCmd/List (0.6s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.60s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.54s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service list -o json
functional_test.go:1504: Took "539.389595ms" to run "out/minikube-linux-arm64 -p functional-240845 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.54s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.39s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:30342
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.39s)

TestFunctional/parallel/ServiceCmd/Format (0.38s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.38s)

TestFunctional/parallel/ServiceCmd/URL (0.5s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:30342
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.50s)

TestFunctional/parallel/MountCmd/specific-port (2.25s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdspecific-port3103942619/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (578.514102ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1218 00:28:50.389299 1159552 retry.go:31] will retry after 354.679787ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdspecific-port3103942619/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh "sudo umount -f /mount-9p": exit status 1 (355.670707ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-240845 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdspecific-port3103942619/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (2.25s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.76s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-240845 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-240845 /tmp/TestFunctionalparallelMountCmdVerifyCleanup2347839077/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.76s)

TestFunctional/parallel/Version/short (0.09s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 version --short
--- PASS: TestFunctional/parallel/Version/short (0.09s)

TestFunctional/parallel/Version/components (1.33s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 version -o=json --components: (1.329699399s)
--- PASS: TestFunctional/parallel/Version/components (1.33s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-240845 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.3
registry.k8s.io/kube-proxy:v1.34.3
registry.k8s.io/kube-controller-manager:v1.34.3
registry.k8s.io/kube-apiserver:v1.34.3
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
public.ecr.aws/nginx/nginx:alpine
localhost/minikube-local-cache-test:functional-240845
localhost/kicbase/echo-server:functional-240845
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:latest
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-240845 image ls --format short --alsologtostderr:
I1218 00:29:03.603938 1189330 out.go:360] Setting OutFile to fd 1 ...
I1218 00:29:03.604121 1189330 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.604149 1189330 out.go:374] Setting ErrFile to fd 2...
I1218 00:29:03.604174 1189330 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.604507 1189330 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:29:03.605146 1189330 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.605297 1189330 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.605935 1189330 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
I1218 00:29:03.624533 1189330 ssh_runner.go:195] Run: systemctl --version
I1218 00:29:03.624593 1189330 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
I1218 00:29:03.641410 1189330 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
I1218 00:29:03.751047 1189330 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.23s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.28s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-240845 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────┬───────────────────────────────────────┬───────────────┬────────┐
│                  IMAGE                  │                  TAG                  │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────┼───────────────────────────────────────┼───────────────┼────────┤
│ registry.k8s.io/coredns/coredns         │ v1.12.1                               │ 138784d87c9c5 │ 73.2MB │
│ registry.k8s.io/etcd                    │ 3.6.5-0                               │ 2c5f0dedd21c2 │ 60.9MB │
│ registry.k8s.io/pause                   │ 3.1                                   │ 8057e0500773a │ 529kB  │
│ registry.k8s.io/pause                   │ 3.10.1                                │ d7b100cd9a77b │ 520kB  │
│ gcr.io/k8s-minikube/busybox             │ 1.28.4-glibc                          │ 1611cd07b61d5 │ 3.77MB │
│ registry.k8s.io/kube-proxy              │ v1.34.3                               │ 4461daf6b6af8 │ 75.9MB │
│ registry.k8s.io/pause                   │ 3.3                                   │ 3d18732f8686c │ 487kB  │
│ registry.k8s.io/pause                   │ latest                                │ 8cb2091f603e7 │ 246kB  │
│ registry.k8s.io/kube-apiserver          │ v1.34.3                               │ cf65ae6c8f700 │ 84.8MB │
│ docker.io/kindest/kindnetd              │ v20251212-v0.29.0-alpha-105-g20ccfc88 │ c96ee3c174987 │ 108MB  │
│ localhost/minikube-local-cache-test     │ functional-240845                     │ 44f667f2a2d22 │ 3.33kB │
│ registry.k8s.io/kube-scheduler          │ v1.34.3                               │ 2f2aa21d34d2d │ 51.6MB │
│ gcr.io/k8s-minikube/storage-provisioner │ v5                                    │ ba04bb24b9575 │ 29MB   │
│ registry.k8s.io/kube-controller-manager │ v1.34.3                               │ 7ada8ff13e54b │ 72.6MB │
│ docker.io/kicbase/echo-server           │ latest                                │ ce2d2cda2d858 │ 4.79MB │
│ localhost/kicbase/echo-server           │ functional-240845                     │ ce2d2cda2d858 │ 4.79MB │
│ docker.io/kindest/kindnetd              │ v20250512-df8de77b                    │ b1a8c6f707935 │ 111MB  │
│ public.ecr.aws/nginx/nginx              │ alpine                                │ 10afed3caf3ee │ 55.1MB │
└─────────────────────────────────────────┴───────────────────────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-240845 image ls --format table --alsologtostderr:
I1218 00:29:03.337177 1189270 out.go:360] Setting OutFile to fd 1 ...
I1218 00:29:03.337336 1189270 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.337342 1189270 out.go:374] Setting ErrFile to fd 2...
I1218 00:29:03.337347 1189270 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.338374 1189270 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:29:03.339046 1189270 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.339165 1189270 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.339882 1189270 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
I1218 00:29:03.358934 1189270 ssh_runner.go:195] Run: systemctl --version
I1218 00:29:03.359012 1189270 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
I1218 00:29:03.390099 1189270 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
I1218 00:29:03.511111 1189270 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.28s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.25s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-240845 image ls --format json --alsologtostderr:
[{"id":"c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13","repoDigests":["docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae","docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3"],"repoTags":["docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"],"size":"108362109"},{"id":"a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a","repoDigests":["docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c","docker.io/kubernetesui/metrics-scraper@sha256:853c43f3cced687cb211708aa0024304a5adb33ec45ebf5915d318358822e09a"],"repoTags":[],"size":"42263767"},{"id":"138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789","registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"73195387"},{"id":"2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534","registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"60857170"},{"id":"20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8","repoDigests":["docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93","docker.io/kubernetesui/dashboard@sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf"],"repoTags":[],"size":"247562353"},{"id":"1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e","gcr.io/k8s-minikube/busybox@sha256:580b0aa58b210f512f818b7b7ef4f63c803f7a8cd6baf571b1462b79f7b7719e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"3774172"},{"id":"ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2","gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"29037500"},{"id":"8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":["registry.k8s.io/pause@sha256:b0602c9f938379133ff8017007894b48c1112681c9468f82a1e4cbf8a4498b67"],"repoTags":["registry.k8s.io/pause:3.1"],"size":"528622"},{"id":"8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":["registry.k8s.io/pause@sha256:f5e31d44aa14d5669e030380b656463a7e45934c03994e72e3dbf83d4a645cca"],"repoTags":["registry.k8s.io/pause:latest"],"size":"246070"},{"id":"cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896","repoDigests":["registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460","registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.3"],"size":"84818927"},{"id":"7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1","registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.3"],"size":"72629077"},{"id":"ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":["docker.io/kicbase/echo-server@sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6","docker.io/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b","docker.io/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a","localhost/kicbase/echo-server@sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6","localhost/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b","localhost/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a"],"repoTags":["docker.io/kicbase/echo-server:latest","localhost/kicbase/echo-server:functional-240845"],"size":"4789170"},{"id":"b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a","docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"111333938"},{"id":"44f667f2a2d226eee5b7856ffc5f5fd241d6cb411cdef1ea77a4fb56a10c7e37","repoDigests":["localhost/minikube-local-cache-test@sha256:d310629568db80437e8ddeee8c149914f8611a18c2aef5ed089abc89e35abd01"],"repoTags":["localhost/minikube-local-cache-test:functional-240845"],"size":"3328"},{"id":"10afed3caf3eed1b711b8fa0a9600a7b488a45653a15a598a47ac570c1204cc4","repoDigests":["public.ecr.aws/nginx/nginx@sha256:2faa7e87b6fbce823070978247970cea2ad90b1936e84eeae1bd2680b03c168d","public.ecr.aws/nginx/nginx@sha256:9b0f84d48f92f2147217aec522219e9eda883a2836f1e30ab1915bd794f294ff"],"repoTags":["public.ecr.aws/nginx/nginx:alpine"],"size":"55077248"},{"id":"2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6","repoDigests":["registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611","registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"],"repoTags":["registry.k8s.io/kube-scheduler:v1.34.3"],"size":"51592021"},{"id":"d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c","registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"519884"},{"id":"4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162","repoDigests":["registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86","registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.3"],"size":"75941783"},{"id":"3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":["registry.k8s.io/pause@sha256:e59730b14890252c14f85976e22ab1c47ec28b111ffed407f34bca1b44447476"],"repoTags":["registry.k8s.io/pause:3.3"],"size":"487479"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-240845 image ls --format json --alsologtostderr:
I1218 00:29:02.581819 1189072 out.go:360] Setting OutFile to fd 1 ...
I1218 00:29:02.582034 1189072 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:02.582057 1189072 out.go:374] Setting ErrFile to fd 2...
I1218 00:29:02.582076 1189072 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:02.582342 1189072 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:29:02.582984 1189072 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:02.583138 1189072 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:02.583709 1189072 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
I1218 00:29:02.605620 1189072 ssh_runner.go:195] Run: systemctl --version
I1218 00:29:02.605679 1189072 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
I1218 00:29:02.625557 1189072 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
I1218 00:29:02.743936 1189072 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.25s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.26s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-240845 image ls --format yaml --alsologtostderr:
- id: ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests:
- docker.io/kicbase/echo-server@sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6
- docker.io/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b
- docker.io/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a
- localhost/kicbase/echo-server@sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6
- localhost/kicbase/echo-server@sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b
- localhost/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a
repoTags:
- docker.io/kicbase/echo-server:latest
- localhost/kicbase/echo-server:functional-240845
size: "4789170"
- id: b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
- docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "111333938"
- id: c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13
repoDigests:
- docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae
- docker.io/kindest/kindnetd@sha256:f1260f5691195cc9a693dc0b55178aa724d944efd62486a8320f0583272b1fa3
repoTags:
- docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
size: "108362109"
- id: a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a
repoDigests:
- docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c
- docker.io/kubernetesui/metrics-scraper@sha256:853c43f3cced687cb211708aa0024304a5adb33ec45ebf5915d318358822e09a
repoTags: []
size: "42263767"
- id: ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "29037500"
- id: 1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
- gcr.io/k8s-minikube/busybox@sha256:580b0aa58b210f512f818b7b7ef4f63c803f7a8cd6baf571b1462b79f7b7719e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "3774172"
- id: 44f667f2a2d226eee5b7856ffc5f5fd241d6cb411cdef1ea77a4fb56a10c7e37
repoDigests:
- localhost/minikube-local-cache-test@sha256:d310629568db80437e8ddeee8c149914f8611a18c2aef5ed089abc89e35abd01
repoTags:
- localhost/minikube-local-cache-test:functional-240845
size: "3328"
- id: 10afed3caf3eed1b711b8fa0a9600a7b488a45653a15a598a47ac570c1204cc4
repoDigests:
- public.ecr.aws/nginx/nginx@sha256:2faa7e87b6fbce823070978247970cea2ad90b1936e84eeae1bd2680b03c168d
- public.ecr.aws/nginx/nginx@sha256:9b0f84d48f92f2147217aec522219e9eda883a2836f1e30ab1915bd794f294ff
repoTags:
- public.ecr.aws/nginx/nginx:alpine
size: "55077248"
- id: 8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests:
- registry.k8s.io/pause@sha256:b0602c9f938379133ff8017007894b48c1112681c9468f82a1e4cbf8a4498b67
repoTags:
- registry.k8s.io/pause:3.1
size: "528622"
- id: d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
- registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f
repoTags:
- registry.k8s.io/pause:3.10.1
size: "519884"
- id: 20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8
repoDigests:
- docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93
- docker.io/kubernetesui/dashboard@sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf
repoTags: []
size: "247562353"
- id: 138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:4779e7517f375a597f100524db6f7f8b5b8499a6ccd14aacfa65432d4cfd5789
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "73195387"
- id: 2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
- registry.k8s.io/etcd@sha256:0f87957e19b97d01b2c70813ee5c4949f8674deac4a65f7167c4cd85f7f2941e
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "60857170"
- id: cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460
- registry.k8s.io/kube-apiserver@sha256:6fa1e54cee33473ab964d87ea870ccf4ac9e6e4012b6d73160fcc3a99c7be9b5
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.3
size: "84818927"
- id: 7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:49437795b4edd6ed8ada141b20cf576fb0aa4e84b82d6a25af841ed293abece1
- registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.3
size: "72629077"
- id: 3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests:
- registry.k8s.io/pause@sha256:e59730b14890252c14f85976e22ab1c47ec28b111ffed407f34bca1b44447476
repoTags:
- registry.k8s.io/pause:3.3
size: "487479"
- id: 4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162
repoDigests:
- registry.k8s.io/kube-proxy@sha256:5c52b97ed657a0a1ef3c24e25d953fcca37fa200f3ec98938c254d748008dd86
- registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6
repoTags:
- registry.k8s.io/kube-proxy:v1.34.3
size: "75941783"
- id: 2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:7f3d992e0f2cb23d075ddafc8c73b5bdcf0ebc01098ef92965cc371eabcb9611
- registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.3
size: "51592021"
- id: 8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests:
- registry.k8s.io/pause@sha256:f5e31d44aa14d5669e030380b656463a7e45934c03994e72e3dbf83d4a645cca
repoTags:
- registry.k8s.io/pause:latest
size: "246070"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-240845 image ls --format yaml --alsologtostderr:
I1218 00:29:03.062906 1189205 out.go:360] Setting OutFile to fd 1 ...
I1218 00:29:03.063079 1189205 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.063109 1189205 out.go:374] Setting ErrFile to fd 2...
I1218 00:29:03.063128 1189205 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.063378 1189205 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:29:03.064106 1189205 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.064308 1189205 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.064860 1189205 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
I1218 00:29:03.092209 1189205 ssh_runner.go:195] Run: systemctl --version
I1218 00:29:03.092505 1189205 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
I1218 00:29:03.118646 1189205 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
I1218 00:29:03.236481 1189205 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.26s)

TestFunctional/parallel/ImageCommands/ImageBuild (3.98s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-240845 ssh pgrep buildkitd: exit status 1 (306.103036ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr: (3.444007423s)
functional_test.go:335: (dbg) Stdout: out/minikube-linux-arm64 -p functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr:
STEP 1/3: FROM gcr.io/k8s-minikube/busybox
STEP 2/3: RUN true
--> f665b1cdf33
STEP 3/3: ADD content.txt /
COMMIT localhost/my-image:functional-240845
--> 6aba3d38f51
Successfully tagged localhost/my-image:functional-240845
6aba3d38f5153598294cb668aba2f0240a6f9459856ed31e4cd1e7335add2b53
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-240845 image build -t localhost/my-image:functional-240845 testdata/build --alsologtostderr:
I1218 00:29:03.159661 1189219 out.go:360] Setting OutFile to fd 1 ...
I1218 00:29:03.160911 1189219 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.160925 1189219 out.go:374] Setting ErrFile to fd 2...
I1218 00:29:03.160931 1189219 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:29:03.161197 1189219 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:29:03.161921 1189219 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.164436 1189219 config.go:182] Loaded profile config "functional-240845": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
I1218 00:29:03.165070 1189219 cli_runner.go:164] Run: docker container inspect functional-240845 --format={{.State.Status}}
I1218 00:29:03.186644 1189219 ssh_runner.go:195] Run: systemctl --version
I1218 00:29:03.186705 1189219 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-240845
I1218 00:29:03.204519 1189219 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33920 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-240845/id_rsa Username:docker}
I1218 00:29:03.311477 1189219 build_images.go:162] Building image from path: /tmp/build.3674311927.tar
I1218 00:29:03.311541 1189219 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1218 00:29:03.321501 1189219 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.3674311927.tar
I1218 00:29:03.326093 1189219 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.3674311927.tar: stat -c "%s %y" /var/lib/minikube/build/build.3674311927.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.3674311927.tar': No such file or directory
I1218 00:29:03.326129 1189219 ssh_runner.go:362] scp /tmp/build.3674311927.tar --> /var/lib/minikube/build/build.3674311927.tar (3072 bytes)
I1218 00:29:03.347713 1189219 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.3674311927
I1218 00:29:03.360759 1189219 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.3674311927 -xf /var/lib/minikube/build/build.3674311927.tar
I1218 00:29:03.371184 1189219 crio.go:315] Building image: /var/lib/minikube/build/build.3674311927
I1218 00:29:03.371268 1189219 ssh_runner.go:195] Run: sudo podman build -t localhost/my-image:functional-240845 /var/lib/minikube/build/build.3674311927 --cgroup-manager=cgroupfs
Trying to pull gcr.io/k8s-minikube/busybox:latest...
Getting image source signatures
Copying blob sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34
Copying config sha256:71a676dd070f4b701c3272e566d84951362f1326ea07d5bbad119d1c4f6b3d02
Writing manifest to image destination
Storing signatures
I1218 00:29:06.506266 1189219 ssh_runner.go:235] Completed: sudo podman build -t localhost/my-image:functional-240845 /var/lib/minikube/build/build.3674311927 --cgroup-manager=cgroupfs: (3.134976037s)
I1218 00:29:06.506331 1189219 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.3674311927
I1218 00:29:06.513968 1189219 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.3674311927.tar
I1218 00:29:06.521482 1189219 build_images.go:218] Built localhost/my-image:functional-240845 from /tmp/build.3674311927.tar
I1218 00:29:06.521514 1189219 build_images.go:134] succeeded building to: functional-240845
I1218 00:29:06.521534 1189219 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (3.98s)

TestFunctional/parallel/ImageCommands/Setup (0.68s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-240845
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.68s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.77s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image load --daemon kicbase/echo-server:functional-240845 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-240845 image load --daemon kicbase/echo-server:functional-240845 --alsologtostderr: (1.533016279s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.77s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (0.86s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image load --daemon kicbase/echo-server:functional-240845 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (0.86s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.28s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-240845
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image load --daemon kicbase/echo-server:functional-240845 --alsologtostderr
2025/12/18 00:28:58 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.28s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.46s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image save kicbase/echo-server:functional-240845 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.46s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.81s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image rm kicbase/echo-server:functional-240845 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.81s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.74s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.74s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.17s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.17s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.18s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.17s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.17s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.54s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-240845
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-240845 image save --daemon kicbase/echo-server:functional-240845 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect localhost/kicbase/echo-server:functional-240845
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.54s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-240845
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-240845
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.01s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-240845
--- PASS: TestFunctional/delete_minikube_cached_images (0.01s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22186-1156339/.minikube/files/etc/test/nested/copy/1159552/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.62s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:3.1: (1.328783084s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:3.3: (1.158983237s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 cache add registry.k8s.io/pause:latest: (1.132591678s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.62s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialCacheC1949460938/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache add minikube-local-cache-test:functional-288604
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache delete minikube-local-cache-test:functional-288604
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.3s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (1.78s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (278.661655ms)
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (1.78s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete (0.12s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd (0.92s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialLogsFi1017617952/001/logs.txt
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd (0.95s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 config get cpus: exit status 14 (94.731881ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 config get cpus: exit status 14 (80.331028ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd (0.50s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1: exit status 23 (180.158141ms)
-- stdout --
	* [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I1218 00:58:39.412351 1219064 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:58:39.412550 1219064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.412571 1219064 out.go:374] Setting ErrFile to fd 2...
	I1218 00:58:39.412592 1219064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.412866 1219064 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:58:39.413247 1219064 out.go:368] Setting JSON to false
	I1218 00:58:39.414077 1219064 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":27668,"bootTime":1765991852,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:58:39.414165 1219064 start.go:143] virtualization:  
	I1218 00:58:39.417487 1219064 out.go:179] * [functional-288604] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1218 00:58:39.420544 1219064 notify.go:221] Checking for updates...
	I1218 00:58:39.421262 1219064 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:58:39.424181 1219064 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:58:39.426998 1219064 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:58:39.429888 1219064 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:58:39.432743 1219064 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:58:39.435562 1219064 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:58:39.438790 1219064 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:58:39.439361 1219064 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:58:39.462846 1219064 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:58:39.462964 1219064 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.516987 1219064 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.508360019 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.517095 1219064 docker.go:319] overlay module found
	I1218 00:58:39.520206 1219064 out.go:179] * Using the docker driver based on existing profile
	I1218 00:58:39.523211 1219064 start.go:309] selected driver: docker
	I1218 00:58:39.523231 1219064 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.523334 1219064 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:58:39.526857 1219064 out.go:203] 
	W1218 00:58:39.529751 1219064 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1218 00:58:39.532659 1219064 out.go:203] 
** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun (0.41s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-288604 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=crio --kubernetes-version=v1.35.0-rc.1: exit status 23 (199.359972ms)
-- stdout --
	* [functional-288604] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	
-- /stdout --
** stderr ** 
	I1218 00:58:39.208745 1219018 out.go:360] Setting OutFile to fd 1 ...
	I1218 00:58:39.208917 1219018 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.208953 1219018 out.go:374] Setting ErrFile to fd 2...
	I1218 00:58:39.208966 1219018 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 00:58:39.209372 1219018 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 00:58:39.209779 1219018 out.go:368] Setting JSON to false
	I1218 00:58:39.210628 1219018 start.go:133] hostinfo: {"hostname":"ip-172-31-21-244","uptime":27668,"bootTime":1765991852,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"da8ac1fd-6236-412a-a346-95873c98230d"}
	I1218 00:58:39.210696 1219018 start.go:143] virtualization:  
	I1218 00:58:39.214262 1219018 out.go:179] * [functional-288604] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1218 00:58:39.217300 1219018 out.go:179]   - MINIKUBE_LOCATION=22186
	I1218 00:58:39.217361 1219018 notify.go:221] Checking for updates...
	I1218 00:58:39.223299 1219018 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1218 00:58:39.226226 1219018 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	I1218 00:58:39.229064 1219018 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	I1218 00:58:39.232001 1219018 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1218 00:58:39.234881 1219018 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1218 00:58:39.238121 1219018 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
	I1218 00:58:39.238696 1219018 driver.go:422] Setting default libvirt URI to qemu:///system
	I1218 00:58:39.274062 1219018 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1218 00:58:39.274192 1219018 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 00:58:39.336418 1219018 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-18 00:58:39.32609817 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 00:58:39.336523 1219018 docker.go:319] overlay module found
	I1218 00:58:39.339542 1219018 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1218 00:58:39.342324 1219018 start.go:309] selected driver: docker
	I1218 00:58:39.342349 1219018 start.go:927] validating driver "docker" against &{Name:functional-288604 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-288604 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:crio CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:crio ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1218 00:58:39.342459 1219018 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1218 00:58:39.345849 1219018 out.go:203] 
	W1218 00:58:39.348638 1219018 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1218 00:58:39.351444 1219018 out.go:203] 
** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage (0.20s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd (0.14s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd (0.77s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh -n functional-288604 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cp functional-288604:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm3792136038/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh -n functional-288604 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh -n functional-288604 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd (2.26s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/1159552/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /etc/test/nested/copy/1159552/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync (0.28s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/1159552.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /etc/ssl/certs/1159552.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/1159552.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /usr/share/ca-certificates/1159552.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/11595522.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /etc/ssl/certs/11595522.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/11595522.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /usr/share/ca-certificates/11595522.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync (1.72s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "sudo systemctl is-active docker": exit status 1 (291.22111ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo systemctl is-active containerd"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "sudo systemctl is-active containerd": exit status 1 (382.75923ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled (0.67s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.24s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.24s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0.00s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-288604 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.42s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.42s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.38s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "326.942521ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "55.207242ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.38s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.4s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "333.676329ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "65.55879ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.40s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (1.63s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3660484500/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (343.567616ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1218 00:58:32.727576 1159552 retry.go:31] will retry after 261.342203ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3660484500/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "sudo umount -f /mount-9p": exit status 1 (262.985955ms)
-- stdout --
	umount: /mount-9p: not mounted.
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-288604 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3660484500/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (1.63s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (2.06s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T" /mount1: exit status 1 (612.682964ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
I1218 00:58:34.625501 1159552 retry.go:31] will retry after 588.111539ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-288604 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-288604 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1391996888/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (2.06s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.06s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.06s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.49s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.49s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.23s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-288604 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-rc.1
registry.k8s.io/kube-proxy:v1.35.0-rc.1
registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
registry.k8s.io/kube-apiserver:v1.35.0-rc.1
registry.k8s.io/etcd:3.6.6-0
registry.k8s.io/coredns/coredns:v1.13.1
localhost/minikube-local-cache-test:functional-288604
localhost/kicbase/echo-server:functional-288604
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/kindest/kindnetd:v20250512-df8de77b
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-288604 image ls --format short --alsologtostderr:
I1218 00:58:51.863645 1221214 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:51.863810 1221214 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:51.863837 1221214 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:51.863854 1221214 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:51.864128 1221214 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:51.864790 1221214 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:51.864968 1221214 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:51.865557 1221214 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:51.883565 1221214 ssh_runner.go:195] Run: systemctl --version
I1218 00:58:51.883633 1221214 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:51.900339 1221214 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:52.009034 1221214 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.23s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.23s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-288604 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                  IMAGE                  │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ registry.k8s.io/pause                   │ 3.3                │ 3d18732f8686c │ 487kB  │
│ gcr.io/k8s-minikube/storage-provisioner │ v5                 │ ba04bb24b9575 │ 29MB   │
│ localhost/minikube-local-cache-test     │ functional-288604  │ 44f667f2a2d22 │ 3.33kB │
│ registry.k8s.io/coredns/coredns         │ v1.13.1            │ e08f4d9d2e6ed │ 74.5MB │
│ registry.k8s.io/kube-apiserver          │ v1.35.0-rc.1       │ 3c6ba27e07aef │ 85MB   │
│ registry.k8s.io/pause                   │ latest             │ 8cb2091f603e7 │ 246kB  │
│ docker.io/kindest/kindnetd              │ v20250512-df8de77b │ b1a8c6f707935 │ 111MB  │
│ localhost/kicbase/echo-server           │ functional-288604  │ ce2d2cda2d858 │ 4.79MB │
│ registry.k8s.io/etcd                    │ 3.6.6-0            │ 271e49a0ebc56 │ 60.9MB │
│ registry.k8s.io/pause                   │ 3.10.1             │ d7b100cd9a77b │ 520kB  │
│ gcr.io/k8s-minikube/busybox             │ latest             │ 71a676dd070f4 │ 1.63MB │
│ localhost/my-image                      │ functional-288604  │ fb9bf8f7eb7b8 │ 1.64MB │
│ registry.k8s.io/kube-controller-manager │ v1.35.0-rc.1       │ a34b3483f25ba │ 72.2MB │
│ registry.k8s.io/kube-proxy              │ v1.35.0-rc.1       │ 7e3acea3d87aa │ 74.1MB │
│ registry.k8s.io/kube-scheduler          │ v1.35.0-rc.1       │ abca4d5226620 │ 49.8MB │
│ registry.k8s.io/pause                   │ 3.1                │ 8057e0500773a │ 529kB  │
└─────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-288604 image ls --format table --alsologtostderr:
I1218 00:58:56.328134 1221710 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:56.328339 1221710 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:56.328366 1221710 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:56.328385 1221710 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:56.328771 1221710 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:56.330029 1221710 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:56.330180 1221710 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:56.330702 1221710 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:56.346971 1221710 ssh_runner.go:195] Run: systemctl --version
I1218 00:58:56.347027 1221710 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:56.363277 1221710 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:56.466580 1221710 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.23s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.23s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-288604 image ls --format json --alsologtostderr:
[{"id":"271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57","repoDigests":["registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890","registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561"],"repoTags":["registry.k8s.io/etcd:3.6.6-0"],"size":"60850387"},{"id":"7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e","repoDigests":["registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f","registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-rc.1"],"size":"74107287"},{"id":"abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde","repoDigests":["registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3","registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0"],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-rc.1"],"size":"49822549"},{"id":"ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2","gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"29037500"},{"id":"8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":["registry.k8s.io/pause@sha256:f5e31d44aa14d5669e030380b656463a7e45934c03994e72e3dbf83d4a645cca"],"repoTags":["registry.k8s.io/pause:latest"],"size":"246070"},{"id":"f5cfd1d7f9a5da7cb77ed440e3264765c749b06bf686a297965c40905dd39555","repoDigests":["docker.io/library/44a6927fde7836fa447ca81ef334d9f40239fe55d44dc16ef995c326a380a8af-tmp@sha256:3f132b6901893f34e7912f5dd5d119a565e19e832fb03b1096a6226ad126a48a"],"repoTags":[],"size":"1638178"},{"id":"71a676dd070f4b701c3272e566d84951362f1326ea07d5bbad119d1c4f6b3d02","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:a77fe109c026308f149d36484d795b42efe0fd29b332be9071f63e1634c36ac9","gcr.io/k8s-minikube/busybox@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b"],"repoTags":["gcr.io/k8s-minikube/busybox:latest"],"size":"1634527"},{"id":"ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":["localhost/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a"],"repoTags":["localhost/kicbase/echo-server:functional-288604"],"size":"4788229"},{"id":"e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":["registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6","registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74"],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"74491780"},{"id":"3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54","repoDigests":["registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee","registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8"],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-rc.1"],"size":"85015535"},{"id":"8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":["registry.k8s.io/pause@sha256:b0602c9f938379133ff8017007894b48c1112681c9468f82a1e4cbf8a4498b67"],"repoTags":["registry.k8s.io/pause:3.1"],"size":"528622"},{"id":"3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":["registry.k8s.io/pause@sha256:e59730b14890252c14f85976e22ab1c47ec28b111ffed407f34bca1b44447476"],"repoTags":["registry.k8s.io/pause:3.3"],"size":"487479"},{"id":"b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a","docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"111333938"},{"id":"a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f","registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"],"size":"72170325"},{"id":"d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c","registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"519884"},{"id":"44f667f2a2d226eee5b7856ffc5f5fd241d6cb411cdef1ea77a4fb56a10c7e37","repoDigests":["localhost/minikube-local-cache-test@sha256:d310629568db80437e8ddeee8c149914f8611a18c2aef5ed089abc89e35abd01"],"repoTags":["localhost/minikube-local-cache-test:functional-288604"],"size":"3328"},{"id":"fb9bf8f7eb7b86b99777085829252dde81f36a36f2a90add2039a02a7b109ae7","repoDigests":["localhost/my-image@sha256:e3374226f0659cc8bd7d32d9399919a8d492b309ffded53cb706d070ab54c2d0"],"repoTags":["localhost/my-image:functional-288604"],"size":"1640791"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-288604 image ls --format json --alsologtostderr:
I1218 00:58:56.097461 1221673 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:56.097607 1221673 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:56.097632 1221673 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:56.097655 1221673 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:56.097921 1221673 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:56.098581 1221673 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:56.098760 1221673 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:56.099357 1221673 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:56.117166 1221673 ssh_runner.go:195] Run: systemctl --version
I1218 00:58:56.117230 1221673 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:56.134121 1221673 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:56.238286 1221673 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.23s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.23s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-288604 image ls --format yaml --alsologtostderr:
- id: 271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57
repoDigests:
- registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890
- registry.k8s.io/etcd@sha256:aa0d8bc8f6a6c3655b8efe0a10c5bf052f5574ebe13f904c5b0c9002ce4b2561
repoTags:
- registry.k8s.io/etcd:3.6.6-0
size: "60850387"
- id: 3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee
- registry.k8s.io/kube-apiserver@sha256:e6ee3594f9ff061c53d6721bc04b810ec4227e28da3bd98e59206d552d45cde8
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-rc.1
size: "85015535"
- id: 7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e
repoDigests:
- registry.k8s.io/kube-proxy@sha256:709cbcd809826ad98b553d8e283a04db70fa653526d1c2a5e1b50000701b2b6f
- registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-rc.1
size: "74107287"
- id: 8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests:
- registry.k8s.io/pause@sha256:b0602c9f938379133ff8017007894b48c1112681c9468f82a1e4cbf8a4498b67
repoTags:
- registry.k8s.io/pause:3.1
size: "528622"
- id: 8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests:
- registry.k8s.io/pause@sha256:f5e31d44aa14d5669e030380b656463a7e45934c03994e72e3dbf83d4a645cca
repoTags:
- registry.k8s.io/pause:latest
size: "246070"
- id: b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
- docker.io/kindest/kindnetd@sha256:2bdc3188f2ddc8e54841f69ef900a8dde1280057c97500f966a7ef31364021f1
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "111333938"
- id: ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests:
- localhost/kicbase/echo-server@sha256:49260110d6ce1914d3de292ed370ee11a2e34ab577b97e6011d795cb13534d4a
repoTags:
- localhost/kicbase/echo-server:functional-288604
size: "4788229"
- id: 44f667f2a2d226eee5b7856ffc5f5fd241d6cb411cdef1ea77a4fb56a10c7e37
repoDigests:
- localhost/minikube-local-cache-test@sha256:d310629568db80437e8ddeee8c149914f8611a18c2aef5ed089abc89e35abd01
repoTags:
- localhost/minikube-local-cache-test:functional-288604
size: "3328"
- id: e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6
- registry.k8s.io/coredns/coredns@sha256:cbd225373d1800b8d9aa2cac02d5be4172ad301cf7a1ffb509ddf8ca1fe06d74
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "74491780"
- id: a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:42360249c0c729ed0542bc8e4a6cd9ba4df358a4e5a140f6c24d5f966ee5121f
- registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
size: "72170325"
- id: abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3
- registry.k8s.io/kube-scheduler@sha256:9ac9664e74153a60bf2c27af77561abc33d85a716a48893c7e50ad356adc4ea0
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-rc.1
size: "49822549"
- id: d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
- registry.k8s.io/pause@sha256:e9c466420bcaeede00f46ecfa0ca8cd854c549f2f13330e2723173d88f2de70f
repoTags:
- registry.k8s.io/pause:3.10.1
size: "519884"
- id: 3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests:
- registry.k8s.io/pause@sha256:e59730b14890252c14f85976e22ab1c47ec28b111ffed407f34bca1b44447476
repoTags:
- registry.k8s.io/pause:3.3
size: "487479"
- id: ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:0ba370588274b88531ab311a5d2e645d240a853555c1e58fd1dd428fc333c9d2
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "29037500"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-288604 image ls --format yaml --alsologtostderr:
I1218 00:58:52.097500 1221250 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:52.097660 1221250 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:52.097686 1221250 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:52.097709 1221250 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:52.097981 1221250 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:52.098677 1221250 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:52.098838 1221250 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:52.099422 1221250 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:52.116876 1221250 ssh_runner.go:195] Run: systemctl --version
I1218 00:58:52.116934 1221250 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:52.133318 1221250 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:52.246587 1221250 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.23s)
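The `image ls --format yaml` output above is a flat list of image records, so the repo tags can be pulled out with standard text tools. A minimal sketch, run against one record copied from the output above (only the first record is embedded here; in practice the full command output would be piped in):

```shell
# One record copied from the `image ls --format yaml` output above.
yaml='- id: 8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests:
- registry.k8s.io/pause@sha256:b0602c9f938379133ff8017007894b48c1112681c9468f82a1e4cbf8a4498b67
repoTags:
- registry.k8s.io/pause:3.1
size: "528622"'

# repoTags entries are the "- " list items that follow a "repoTags:" key,
# up to the next non-list-item line (e.g. "size:").
tags=$(printf '%s\n' "$yaml" | awk '/^repoTags:/{grab=1; next} /^[^-]/{grab=0} grab{print substr($0,3)}')
echo "$tags"
```

This is line-oriented scraping, not YAML parsing; a proper YAML library would be the robust choice, but the fixed shape of this output makes awk sufficient for quick inspection.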

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.77s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-288604 ssh pgrep buildkitd: exit status 1 (267.778225ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image build -t localhost/my-image:functional-288604 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-288604 image build -t localhost/my-image:functional-288604 testdata/build --alsologtostderr: (3.258202762s)
functional_test.go:335: (dbg) Stdout: out/minikube-linux-arm64 -p functional-288604 image build -t localhost/my-image:functional-288604 testdata/build --alsologtostderr:
STEP 1/3: FROM gcr.io/k8s-minikube/busybox
STEP 2/3: RUN true
--> f5cfd1d7f9a
STEP 3/3: ADD content.txt /
COMMIT localhost/my-image:functional-288604
--> fb9bf8f7eb7
Successfully tagged localhost/my-image:functional-288604
fb9bf8f7eb7b86b99777085829252dde81f36a36f2a90add2039a02a7b109ae7
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-288604 image build -t localhost/my-image:functional-288604 testdata/build --alsologtostderr:
I1218 00:58:52.598823 1221356 out.go:360] Setting OutFile to fd 1 ...
I1218 00:58:52.598943 1221356 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:52.598961 1221356 out.go:374] Setting ErrFile to fd 2...
I1218 00:58:52.598967 1221356 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1218 00:58:52.599319 1221356 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
I1218 00:58:52.600282 1221356 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:52.601551 1221356 config.go:182] Loaded profile config "functional-288604": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.35.0-rc.1
I1218 00:58:52.602206 1221356 cli_runner.go:164] Run: docker container inspect functional-288604 --format={{.State.Status}}
I1218 00:58:52.620396 1221356 ssh_runner.go:195] Run: systemctl --version
I1218 00:58:52.620458 1221356 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-288604
I1218 00:58:52.639295 1221356 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33925 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/functional-288604/id_rsa Username:docker}
I1218 00:58:52.742851 1221356 build_images.go:162] Building image from path: /tmp/build.701022457.tar
I1218 00:58:52.742934 1221356 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1218 00:58:52.751037 1221356 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.701022457.tar
I1218 00:58:52.754634 1221356 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.701022457.tar: stat -c "%s %y" /var/lib/minikube/build/build.701022457.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.701022457.tar': No such file or directory
I1218 00:58:52.754664 1221356 ssh_runner.go:362] scp /tmp/build.701022457.tar --> /var/lib/minikube/build/build.701022457.tar (3072 bytes)
I1218 00:58:52.772026 1221356 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.701022457
I1218 00:58:52.779963 1221356 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.701022457 -xf /var/lib/minikube/build/build.701022457.tar
I1218 00:58:52.787998 1221356 crio.go:315] Building image: /var/lib/minikube/build/build.701022457
I1218 00:58:52.788093 1221356 ssh_runner.go:195] Run: sudo podman build -t localhost/my-image:functional-288604 /var/lib/minikube/build/build.701022457 --cgroup-manager=cgroupfs
Trying to pull gcr.io/k8s-minikube/busybox:latest...
Getting image source signatures
Copying blob sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34
Copying config sha256:71a676dd070f4b701c3272e566d84951362f1326ea07d5bbad119d1c4f6b3d02
Writing manifest to image destination
Storing signatures
I1218 00:58:55.779262 1221356 ssh_runner.go:235] Completed: sudo podman build -t localhost/my-image:functional-288604 /var/lib/minikube/build/build.701022457 --cgroup-manager=cgroupfs: (2.99113819s)
I1218 00:58:55.779337 1221356 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.701022457
I1218 00:58:55.787947 1221356 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.701022457.tar
I1218 00:58:55.797726 1221356 build_images.go:218] Built localhost/my-image:functional-288604 from /tmp/build.701022457.tar
I1218 00:58:55.797753 1221356 build_images.go:134] succeeded building to: functional-288604
I1218 00:58:55.797758 1221356 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.77s)
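The three `STEP 1/3`..`3/3` lines in the build log above correspond to a three-instruction Dockerfile. A sketch of what a matching `testdata/build` context looks like, reconstructed from the log output only (the real directory in the minikube repo may differ):

```shell
# Build a context directory matching the STEP lines in the log above
# (hypothetical reconstruction, inferred from the build output).
ctx=$(mktemp -d)
cat > "$ctx/Dockerfile" <<'EOF'
FROM gcr.io/k8s-minikube/busybox
RUN true
ADD content.txt /
EOF
echo hello > "$ctx/content.txt"

# Per the ssh_runner lines above, minikube tars this directory, copies the
# tar to /var/lib/minikube/build/ on the node, unpacks it, and (on a crio
# runtime) runs: sudo podman build -t localhost/my-image:<profile> <dir>
steps=$(grep -c -E '^(FROM|RUN|ADD)' "$ctx/Dockerfile")
echo "$steps"
```

Note that although the runtime is crio, the build itself is delegated to `podman build` inside the node, which is why the stderr shows podman pulling `gcr.io/k8s-minikube/busybox:latest`.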

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.19s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.19s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (0.82s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (0.82s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.09s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-288604
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image load --daemon kicbase/echo-server:functional-288604 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.09s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image save kicbase/echo-server:functional-288604 /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.52s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image rm kicbase/echo-server:functional-288604 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.52s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.76s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image load /home/jenkins/workspace/Docker_Linux_crio_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.76s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.42s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-288604
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 image save --daemon kicbase/echo-server:functional-288604 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect localhost/kicbase/echo-server:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.42s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.16s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.16s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-288604 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.01s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.01s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.01s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-288604
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.01s)

TestMultiControlPlane/serial/StartCluster (147.5s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=crio
E1218 01:01:17.040414 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.046754 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.058069 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.079422 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.120780 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.202127 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.363470 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:17.685080 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:18.327087 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:19.608852 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:22.170605 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:27.292754 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:37.534178 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:01:58.016312 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:02:38.977721 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=crio: (2m26.592651806s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/StartCluster (147.50s)

TestMultiControlPlane/serial/DeployApp (7.04s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 kubectl -- rollout status deployment/busybox: (4.444998457s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-29llh -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-97s66 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-tzcb6 -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-29llh -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-97s66 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-tzcb6 -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-29llh -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-97s66 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-tzcb6 -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (7.04s)

TestMultiControlPlane/serial/PingHostFromPods (1.5s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-29llh -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-29llh -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-97s66 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-97s66 -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-tzcb6 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 kubectl -- exec busybox-7b57f96db7-tzcb6 -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.50s)
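The `nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3` pipeline above relies on the fixed line layout of busybox's `nslookup`: line 5 carries the resolved address. A sketch of just the parsing step, using illustrative sample output (not captured from this run):

```shell
# Illustrative busybox-style nslookup output; the test assumes line 5 is
# "Address 1: <ip> <name>", so the IP is the third space-separated field.
out='Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.168.49.1 host.minikube.internal'

ip=$(printf '%s\n' "$out" | awk 'NR==5' | cut -d' ' -f3)
echo "$ip"
```

The extracted IP (192.168.49.1 here, the docker bridge gateway) is then fed to `ping -c 1`, which is how the test verifies pod-to-host connectivity. Pinning to `NR==5` is brittle against resolver output changes, which is worth keeping in mind when this test flakes.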

TestMultiControlPlane/serial/AddWorkerNode (34.03s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node add --alsologtostderr -v 5
E1218 01:03:19.389920 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 node add --alsologtostderr -v 5: (32.924072231s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5: (1.100875958s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (34.03s)

TestMultiControlPlane/serial/NodeLabels (0.13s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-391566 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.13s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.04s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.042379849s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.04s)

TestMultiControlPlane/serial/CopyFile (20.10s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 status --output json --alsologtostderr -v 5: (1.047368729s)
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp testdata/cp-test.txt ha-391566:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile3974556561/001/cp-test_ha-391566.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566:/home/docker/cp-test.txt ha-391566-m02:/home/docker/cp-test_ha-391566_ha-391566-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test_ha-391566_ha-391566-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566:/home/docker/cp-test.txt ha-391566-m03:/home/docker/cp-test_ha-391566_ha-391566-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test_ha-391566_ha-391566-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566:/home/docker/cp-test.txt ha-391566-m04:/home/docker/cp-test_ha-391566_ha-391566-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test_ha-391566_ha-391566-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp testdata/cp-test.txt ha-391566-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile3974556561/001/cp-test_ha-391566-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m02:/home/docker/cp-test.txt ha-391566:/home/docker/cp-test_ha-391566-m02_ha-391566.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test_ha-391566-m02_ha-391566.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m02:/home/docker/cp-test.txt ha-391566-m03:/home/docker/cp-test_ha-391566-m02_ha-391566-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test_ha-391566-m02_ha-391566-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m02:/home/docker/cp-test.txt ha-391566-m04:/home/docker/cp-test_ha-391566-m02_ha-391566-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test_ha-391566-m02_ha-391566-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp testdata/cp-test.txt ha-391566-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile3974556561/001/cp-test_ha-391566-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m03:/home/docker/cp-test.txt ha-391566:/home/docker/cp-test_ha-391566-m03_ha-391566.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test_ha-391566-m03_ha-391566.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m03:/home/docker/cp-test.txt ha-391566-m02:/home/docker/cp-test_ha-391566-m03_ha-391566-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test_ha-391566-m03_ha-391566-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m03:/home/docker/cp-test.txt ha-391566-m04:/home/docker/cp-test_ha-391566-m03_ha-391566-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test_ha-391566-m03_ha-391566-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp testdata/cp-test.txt ha-391566-m04:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile3974556561/001/cp-test_ha-391566-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m04:/home/docker/cp-test.txt ha-391566:/home/docker/cp-test_ha-391566-m04_ha-391566.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566 "sudo cat /home/docker/cp-test_ha-391566-m04_ha-391566.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m04:/home/docker/cp-test.txt ha-391566-m02:/home/docker/cp-test_ha-391566-m04_ha-391566-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m02 "sudo cat /home/docker/cp-test_ha-391566-m04_ha-391566-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 cp ha-391566-m04:/home/docker/cp-test.txt ha-391566-m03:/home/docker/cp-test_ha-391566-m04_ha-391566-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 ssh -n ha-391566-m03 "sudo cat /home/docker/cp-test_ha-391566-m04_ha-391566-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.10s)

TestMultiControlPlane/serial/StopSecondaryNode (12.86s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node stop m02 --alsologtostderr -v 5
E1218 01:04:00.899183 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 node stop m02 --alsologtostderr -v 5: (12.054559356s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5: exit status 7 (801.202162ms)

-- stdout --
	ha-391566
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-391566-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-391566-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-391566-m04
	type: Worker
	host: Running
	kubelet: Running

-- /stdout --
** stderr ** 
	I1218 01:04:04.663117 1237421 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:04:04.663320 1237421 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:04:04.663350 1237421 out.go:374] Setting ErrFile to fd 2...
	I1218 01:04:04.663370 1237421 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:04:04.663646 1237421 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:04:04.663860 1237421 out.go:368] Setting JSON to false
	I1218 01:04:04.663923 1237421 mustload.go:66] Loading cluster: ha-391566
	I1218 01:04:04.664023 1237421 notify.go:221] Checking for updates...
	I1218 01:04:04.664417 1237421 config.go:182] Loaded profile config "ha-391566": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:04:04.664462 1237421 status.go:174] checking status of ha-391566 ...
	I1218 01:04:04.665368 1237421 cli_runner.go:164] Run: docker container inspect ha-391566 --format={{.State.Status}}
	I1218 01:04:04.687462 1237421 status.go:371] ha-391566 host status = "Running" (err=<nil>)
	I1218 01:04:04.687483 1237421 host.go:66] Checking if "ha-391566" exists ...
	I1218 01:04:04.687806 1237421 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-391566
	I1218 01:04:04.716403 1237421 host.go:66] Checking if "ha-391566" exists ...
	I1218 01:04:04.716759 1237421 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:04:04.718088 1237421 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-391566
	I1218 01:04:04.740496 1237421 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33930 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/ha-391566/id_rsa Username:docker}
	I1218 01:04:04.849679 1237421 ssh_runner.go:195] Run: systemctl --version
	I1218 01:04:04.856718 1237421 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:04:04.869660 1237421 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:04:04.933620 1237421 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:62 OomKillDisable:true NGoroutines:72 SystemTime:2025-12-18 01:04:04.924055512 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:04:04.935069 1237421 kubeconfig.go:125] found "ha-391566" server: "https://192.168.49.254:8443"
	I1218 01:04:04.935116 1237421 api_server.go:166] Checking apiserver status ...
	I1218 01:04:04.935167 1237421 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:04:04.946601 1237421 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1263/cgroup
	I1218 01:04:04.955039 1237421 api_server.go:182] apiserver freezer: "10:freezer:/docker/7ff25377bceeb1d0ab971c7fe3389482c12ea48fbce488f738e7efc7542f71f5/crio/crio-9db859773f437372a452f9d9c3c348586e7518248372831cfd0c74db3966bad3"
	I1218 01:04:04.955110 1237421 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/7ff25377bceeb1d0ab971c7fe3389482c12ea48fbce488f738e7efc7542f71f5/crio/crio-9db859773f437372a452f9d9c3c348586e7518248372831cfd0c74db3966bad3/freezer.state
	I1218 01:04:04.963278 1237421 api_server.go:204] freezer state: "THAWED"
	I1218 01:04:04.963308 1237421 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1218 01:04:04.971387 1237421 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1218 01:04:04.971416 1237421 status.go:463] ha-391566 apiserver status = Running (err=<nil>)
	I1218 01:04:04.971436 1237421 status.go:176] ha-391566 status: &{Name:ha-391566 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:04:04.971454 1237421 status.go:174] checking status of ha-391566-m02 ...
	I1218 01:04:04.971778 1237421 cli_runner.go:164] Run: docker container inspect ha-391566-m02 --format={{.State.Status}}
	I1218 01:04:04.990530 1237421 status.go:371] ha-391566-m02 host status = "Stopped" (err=<nil>)
	I1218 01:04:04.990605 1237421 status.go:384] host is not running, skipping remaining checks
	I1218 01:04:04.990618 1237421 status.go:176] ha-391566-m02 status: &{Name:ha-391566-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:04:04.990644 1237421 status.go:174] checking status of ha-391566-m03 ...
	I1218 01:04:04.991031 1237421 cli_runner.go:164] Run: docker container inspect ha-391566-m03 --format={{.State.Status}}
	I1218 01:04:05.015535 1237421 status.go:371] ha-391566-m03 host status = "Running" (err=<nil>)
	I1218 01:04:05.015559 1237421 host.go:66] Checking if "ha-391566-m03" exists ...
	I1218 01:04:05.015869 1237421 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-391566-m03
	I1218 01:04:05.035644 1237421 host.go:66] Checking if "ha-391566-m03" exists ...
	I1218 01:04:05.035977 1237421 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:04:05.036035 1237421 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-391566-m03
	I1218 01:04:05.053683 1237421 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33940 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/ha-391566-m03/id_rsa Username:docker}
	I1218 01:04:05.158684 1237421 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:04:05.172523 1237421 kubeconfig.go:125] found "ha-391566" server: "https://192.168.49.254:8443"
	I1218 01:04:05.172555 1237421 api_server.go:166] Checking apiserver status ...
	I1218 01:04:05.172600 1237421 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:04:05.183727 1237421 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1199/cgroup
	I1218 01:04:05.192948 1237421 api_server.go:182] apiserver freezer: "10:freezer:/docker/4c0613d8a444e97a690897d49dc5a70bc90248860ee0d197f3be52dfe8721ce5/crio/crio-00ba21f8cd610fa7ee320213bb7a3304a3d94b818ac001c26bf90f71c7549898"
	I1218 01:04:05.193045 1237421 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/4c0613d8a444e97a690897d49dc5a70bc90248860ee0d197f3be52dfe8721ce5/crio/crio-00ba21f8cd610fa7ee320213bb7a3304a3d94b818ac001c26bf90f71c7549898/freezer.state
	I1218 01:04:05.200308 1237421 api_server.go:204] freezer state: "THAWED"
	I1218 01:04:05.200336 1237421 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1218 01:04:05.209150 1237421 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1218 01:04:05.209187 1237421 status.go:463] ha-391566-m03 apiserver status = Running (err=<nil>)
	I1218 01:04:05.209198 1237421 status.go:176] ha-391566-m03 status: &{Name:ha-391566-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:04:05.209217 1237421 status.go:174] checking status of ha-391566-m04 ...
	I1218 01:04:05.209539 1237421 cli_runner.go:164] Run: docker container inspect ha-391566-m04 --format={{.State.Status}}
	I1218 01:04:05.227892 1237421 status.go:371] ha-391566-m04 host status = "Running" (err=<nil>)
	I1218 01:04:05.227919 1237421 host.go:66] Checking if "ha-391566-m04" exists ...
	I1218 01:04:05.228349 1237421 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-391566-m04
	I1218 01:04:05.244740 1237421 host.go:66] Checking if "ha-391566-m04" exists ...
	I1218 01:04:05.245054 1237421 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:04:05.245095 1237421 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-391566-m04
	I1218 01:04:05.262233 1237421 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33945 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/ha-391566-m04/id_rsa Username:docker}
	I1218 01:04:05.365684 1237421 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:04:05.384124 1237421 status.go:176] ha-391566-m04 status: &{Name:ha-391566-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (12.86s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.85s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.85s)

TestMultiControlPlane/serial/RestartSecondaryNode (30.32s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node start m02 --alsologtostderr -v 5
E1218 01:04:32.021885 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 node start m02 --alsologtostderr -v 5: (28.814735685s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5: (1.376822091s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (30.32s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.36s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.355261336s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.36s)

TestMultiControlPlane/serial/RestartClusterKeepsNodes (124.80s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 stop --alsologtostderr -v 5
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 stop --alsologtostderr -v 5: (27.499430698s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 start --wait true --alsologtostderr -v 5
E1218 01:06:17.039864 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:06:22.463696 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 start --wait true --alsologtostderr -v 5: (1m37.116324963s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (124.80s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.69s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node delete m03 --alsologtostderr -v 5
E1218 01:06:44.741335 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 node delete m03 --alsologtostderr -v 5: (10.734438242s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.69s)

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.79s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.79s)

TestMultiControlPlane/serial/StopCluster (36.10s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 stop --alsologtostderr -v 5
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 stop --alsologtostderr -v 5: (35.990541777s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5: exit status 7 (112.490638ms)

-- stdout --
	ha-391566
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-391566-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-391566-m04
	type: Worker
	host: Stopped
	kubelet: Stopped

-- /stdout --
** stderr ** 
	I1218 01:07:31.226022 1249368 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:07:31.226204 1249368 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:07:31.226234 1249368 out.go:374] Setting ErrFile to fd 2...
	I1218 01:07:31.226254 1249368 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:07:31.226844 1249368 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:07:31.227080 1249368 out.go:368] Setting JSON to false
	I1218 01:07:31.227145 1249368 mustload.go:66] Loading cluster: ha-391566
	I1218 01:07:31.227237 1249368 notify.go:221] Checking for updates...
	I1218 01:07:31.227637 1249368 config.go:182] Loaded profile config "ha-391566": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:07:31.227683 1249368 status.go:174] checking status of ha-391566 ...
	I1218 01:07:31.228198 1249368 cli_runner.go:164] Run: docker container inspect ha-391566 --format={{.State.Status}}
	I1218 01:07:31.247554 1249368 status.go:371] ha-391566 host status = "Stopped" (err=<nil>)
	I1218 01:07:31.247574 1249368 status.go:384] host is not running, skipping remaining checks
	I1218 01:07:31.247580 1249368 status.go:176] ha-391566 status: &{Name:ha-391566 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:07:31.247608 1249368 status.go:174] checking status of ha-391566-m02 ...
	I1218 01:07:31.247907 1249368 cli_runner.go:164] Run: docker container inspect ha-391566-m02 --format={{.State.Status}}
	I1218 01:07:31.277232 1249368 status.go:371] ha-391566-m02 host status = "Stopped" (err=<nil>)
	I1218 01:07:31.277253 1249368 status.go:384] host is not running, skipping remaining checks
	I1218 01:07:31.277260 1249368 status.go:176] ha-391566-m02 status: &{Name:ha-391566-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:07:31.277279 1249368 status.go:174] checking status of ha-391566-m04 ...
	I1218 01:07:31.277581 1249368 cli_runner.go:164] Run: docker container inspect ha-391566-m04 --format={{.State.Status}}
	I1218 01:07:31.293287 1249368 status.go:371] ha-391566-m04 host status = "Stopped" (err=<nil>)
	I1218 01:07:31.293306 1249368 status.go:384] host is not running, skipping remaining checks
	I1218 01:07:31.293314 1249368 status.go:176] ha-391566-m04 status: &{Name:ha-391566-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (36.10s)

TestMultiControlPlane/serial/RestartCluster (82.74s)

=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=crio
E1218 01:08:19.390073 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=crio: (1m21.718213126s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (82.74s)

TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.81s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.81s)

TestMultiControlPlane/serial/AddSecondaryNode (53.38s)

=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 node add --control-plane --alsologtostderr -v 5
E1218 01:09:15.104414 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:09:32.021454 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 node add --control-plane --alsologtostderr -v 5: (52.324729554s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-391566 status --alsologtostderr -v 5: (1.052033838s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (53.38s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.05s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.053627095s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.05s)

TestJSONOutput/start/Command (52.59s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-368600 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=crio
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-368600 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=crio: (52.581541651s)
--- PASS: TestJSONOutput/start/Command (52.59s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (5.81s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-368600 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-368600 --output=json --user=testUser: (5.810151669s)
--- PASS: TestJSONOutput/stop/Command (5.81s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.24s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-651301 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-651301 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (96.656484ms)

-- stdout --
	{"specversion":"1.0","id":"94e8d61c-d17c-40f0-b5e0-92af39ead2e2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-651301] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"78890313-7712-4436-aa24-c9ae9795e85b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22186"}}
	{"specversion":"1.0","id":"4a59daf1-8f29-46db-ae1c-b13137b58b26","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"35a1c128-b029-46a2-830e-b317edeb0338","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig"}}
	{"specversion":"1.0","id":"abf9c93b-2c6c-4c46-9955-78fba906a6a1","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube"}}
	{"specversion":"1.0","id":"668d1d5b-2a2d-4fa4-8f37-59babca89a86","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"cf9c3f2e-8872-4f94-b5df-8b62ccbe98ef","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"f676f93d-8c2a-4a41-a30f-5c0d5c0456ea","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:176: Cleaning up "json-output-error-651301" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-651301
--- PASS: TestErrorJSONOutput (0.24s)

TestKicCustomNetwork/create_custom_network (39.95s)

=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-102821 --network=
E1218 01:11:17.040344 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-102821 --network=: (37.72468405s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-102821" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-102821
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-102821: (2.203796483s)
--- PASS: TestKicCustomNetwork/create_custom_network (39.95s)

TestKicCustomNetwork/use_default_bridge_network (34.73s)

=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-915853 --network=bridge
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-915853 --network=bridge: (32.591456632s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-915853" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-915853
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-915853: (2.120321087s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (34.73s)

TestKicExistingNetwork (37.86s)

=== RUN   TestKicExistingNetwork
I1218 01:12:19.422662 1159552 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1218 01:12:19.437854 1159552 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1218 01:12:19.437948 1159552 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1218 01:12:19.437972 1159552 cli_runner.go:164] Run: docker network inspect existing-network
W1218 01:12:19.453920 1159552 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1218 01:12:19.453951 1159552 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]

stderr:
Error response from daemon: network existing-network not found
I1218 01:12:19.453965 1159552 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]

-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found

** /stderr **
I1218 01:12:19.454102 1159552 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1218 01:12:19.472534 1159552 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-6457214f0a50 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:0a:67:20:e8:65:78} reservation:<nil>}
I1218 01:12:19.472852 1159552 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001a45bb0}
I1218 01:12:19.472878 1159552 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1218 01:12:19.472928 1159552 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1218 01:12:19.535803 1159552 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-823246 --network=existing-network
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-823246 --network=existing-network: (35.594155956s)
helpers_test.go:176: Cleaning up "existing-network-823246" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-823246
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-823246: (2.12361808s)
I1218 01:12:57.270844 1159552 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (37.86s)

TestKicCustomSubnet (34.94s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-405200 --subnet=192.168.60.0/24
E1218 01:13:19.390576 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-405200 --subnet=192.168.60.0/24: (32.624032971s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-405200 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:176: Cleaning up "custom-subnet-405200" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-405200
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-405200: (2.279207152s)
--- PASS: TestKicCustomSubnet (34.94s)

TestKicStaticIP (36.04s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-468352 --static-ip=192.168.200.200
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-468352 --static-ip=192.168.200.200: (33.609509227s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-468352 ip
helpers_test.go:176: Cleaning up "static-ip-468352" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-468352
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-468352: (2.245945237s)
--- PASS: TestKicStaticIP (36.04s)

TestMainNoArgs (0.05s)

=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.05s)

TestMinikubeProfile (70.31s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-642849 --driver=docker  --container-runtime=crio
E1218 01:14:32.021431 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-642849 --driver=docker  --container-runtime=crio: (31.513677633s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-645787 --driver=docker  --container-runtime=crio
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-645787 --driver=docker  --container-runtime=crio: (32.774354769s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-642849
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-645787
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:176: Cleaning up "second-645787" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p second-645787
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p second-645787: (2.151796407s)
helpers_test.go:176: Cleaning up "first-642849" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p first-642849
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p first-642849: (2.449050343s)
--- PASS: TestMinikubeProfile (70.31s)

TestMountStart/serial/StartWithMountFirst (9.15s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-619403 --memory=3072 --mount-string /tmp/TestMountStartserial515780528/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=crio
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-619403 --memory=3072 --mount-string /tmp/TestMountStartserial515780528/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=crio: (8.146619413s)
--- PASS: TestMountStart/serial/StartWithMountFirst (9.15s)

TestMountStart/serial/VerifyMountFirst (0.27s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-619403 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.27s)

TestMountStart/serial/StartWithMountSecond (8.59s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-621273 --memory=3072 --mount-string /tmp/TestMountStartserial515780528/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=crio
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-621273 --memory=3072 --mount-string /tmp/TestMountStartserial515780528/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=crio: (7.588208523s)
--- PASS: TestMountStart/serial/StartWithMountSecond (8.59s)

TestMountStart/serial/VerifyMountSecond (0.28s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-621273 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.28s)

TestMountStart/serial/DeleteFirst (1.72s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-619403 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-619403 --alsologtostderr -v=5: (1.719694844s)
--- PASS: TestMountStart/serial/DeleteFirst (1.72s)

TestMountStart/serial/VerifyMountPostDelete (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-621273 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

TestMountStart/serial/Stop (1.29s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-621273
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-621273: (1.292411905s)
--- PASS: TestMountStart/serial/Stop (1.29s)

TestMountStart/serial/RestartStopped (8.29s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-621273
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-621273: (7.289532884s)
--- PASS: TestMountStart/serial/RestartStopped (8.29s)

TestMountStart/serial/VerifyMountPostStop (0.27s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-621273 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.27s)

TestMultiNode/serial/FreshStart2Nodes (80.99s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-771808 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=crio
E1218 01:16:17.040377 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-771808 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=crio: (1m20.4346239s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (80.99s)

TestMultiNode/serial/DeployApp2Nodes (5.12s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-771808 -- rollout status deployment/busybox: (3.360739672s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-n6qq7 -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-sp4mm -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-n6qq7 -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-sp4mm -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-n6qq7 -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-sp4mm -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.12s)

TestMultiNode/serial/PingHostFrom2Pods (0.92s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-n6qq7 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-n6qq7 -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-sp4mm -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-771808 -- exec busybox-7b57f96db7-sp4mm -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.92s)
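The host-IP extraction in the test above (`nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3`) works because busybox-style nslookup prints the resolved address on line 5 of its output. A minimal sketch of that parsing; the sample text below is fabricated for illustration, not taken from this run:

```shell
# Same pipeline the test runs inside the pod: take line 5 of the
# nslookup output, then the third space-separated field (the IP).
sample='Server:    10.96.0.10
Address 1: 10.96.0.10

Name:      host.minikube.internal
Address 1: 192.168.67.1'

printf '%s\n' "$sample" | awk 'NR==5' | cut -d' ' -f3   # prints 192.168.67.1
```

The extracted address is then the target of the follow-up `ping -c 1`, confirming each pod can reach the host network.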

TestMultiNode/serial/AddNode (30.42s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-771808 -v=5 --alsologtostderr
E1218 01:17:40.102739 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-771808 -v=5 --alsologtostderr: (29.713506158s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (30.42s)

TestMultiNode/serial/MultiNodeLabels (0.09s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-771808 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.09s)

TestMultiNode/serial/ProfileList (0.74s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.74s)

TestMultiNode/serial/CopyFile (10.51s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --output json --alsologtostderr
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp testdata/cp-test.txt multinode-771808:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2609857370/001/cp-test_multinode-771808.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808:/home/docker/cp-test.txt multinode-771808-m02:/home/docker/cp-test_multinode-771808_multinode-771808-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test_multinode-771808_multinode-771808-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808:/home/docker/cp-test.txt multinode-771808-m03:/home/docker/cp-test_multinode-771808_multinode-771808-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test_multinode-771808_multinode-771808-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp testdata/cp-test.txt multinode-771808-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2609857370/001/cp-test_multinode-771808-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m02:/home/docker/cp-test.txt multinode-771808:/home/docker/cp-test_multinode-771808-m02_multinode-771808.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test_multinode-771808-m02_multinode-771808.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m02:/home/docker/cp-test.txt multinode-771808-m03:/home/docker/cp-test_multinode-771808-m02_multinode-771808-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test_multinode-771808-m02_multinode-771808-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp testdata/cp-test.txt multinode-771808-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2609857370/001/cp-test_multinode-771808-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m03:/home/docker/cp-test.txt multinode-771808:/home/docker/cp-test_multinode-771808-m03_multinode-771808.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808 "sudo cat /home/docker/cp-test_multinode-771808-m03_multinode-771808.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 cp multinode-771808-m03:/home/docker/cp-test.txt multinode-771808-m02:/home/docker/cp-test_multinode-771808-m03_multinode-771808-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 ssh -n multinode-771808-m02 "sudo cat /home/docker/cp-test_multinode-771808-m03_multinode-771808-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (10.51s)
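The CopyFile sequence above round-trips a file host → node, node → host, and node → node, running `ssh -n <node> "sudo cat …"` after every hop to confirm the contents survive. The same check can be sketched locally, with plain `cp`/`cmp` standing in for `minikube cp` and the ssh-based cat (all paths below are illustrative):

```shell
set -e
work=$(mktemp -d)
printf 'Test file for checking file cp command\n' > "$work/cp-test.txt"

cp "$work/cp-test.txt" "$work/node1_cp-test.txt"        # host  -> node1
cp "$work/node1_cp-test.txt" "$work/back_on_host.txt"   # node1 -> host
cp "$work/node1_cp-test.txt" "$work/node2_cp-test.txt"  # node1 -> node2

# Equivalent of the per-hop `sudo cat` checks: contents must be
# byte-identical after every copy.
cmp "$work/cp-test.txt" "$work/back_on_host.txt"
cmp "$work/cp-test.txt" "$work/node2_cp-test.txt"
echo "round-trip OK"
rm -rf "$work"
```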

TestMultiNode/serial/StopNode (2.46s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-771808 node stop m03: (1.308576668s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-771808 status: exit status 7 (578.981225ms)

-- stdout --
	multinode-771808
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-771808-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-771808-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr: exit status 7 (570.806941ms)

-- stdout --
	multinode-771808
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-771808-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-771808-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1218 01:18:01.372074 1299977 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:18:01.372272 1299977 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:18:01.372305 1299977 out.go:374] Setting ErrFile to fd 2...
	I1218 01:18:01.372325 1299977 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:18:01.372613 1299977 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:18:01.372830 1299977 out.go:368] Setting JSON to false
	I1218 01:18:01.372907 1299977 mustload.go:66] Loading cluster: multinode-771808
	I1218 01:18:01.372981 1299977 notify.go:221] Checking for updates...
	I1218 01:18:01.374023 1299977 config.go:182] Loaded profile config "multinode-771808": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:18:01.374072 1299977 status.go:174] checking status of multinode-771808 ...
	I1218 01:18:01.374646 1299977 cli_runner.go:164] Run: docker container inspect multinode-771808 --format={{.State.Status}}
	I1218 01:18:01.395404 1299977 status.go:371] multinode-771808 host status = "Running" (err=<nil>)
	I1218 01:18:01.395425 1299977 host.go:66] Checking if "multinode-771808" exists ...
	I1218 01:18:01.395701 1299977 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-771808
	I1218 01:18:01.423471 1299977 host.go:66] Checking if "multinode-771808" exists ...
	I1218 01:18:01.423778 1299977 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:18:01.423816 1299977 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-771808
	I1218 01:18:01.449482 1299977 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34050 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/multinode-771808/id_rsa Username:docker}
	I1218 01:18:01.554233 1299977 ssh_runner.go:195] Run: systemctl --version
	I1218 01:18:01.562134 1299977 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:18:01.577004 1299977 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1218 01:18:01.640072 1299977 info.go:266] docker info: {ID:5FDH:SA5P:5GCT:NLAS:B73P:SGDQ:PBG5:UBVH:UZY3:RXGO:CI7S:WAIH Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:50 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-18 01:18:01.630414731 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-21-244 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1218 01:18:01.640749 1299977 kubeconfig.go:125] found "multinode-771808" server: "https://192.168.67.2:8443"
	I1218 01:18:01.640784 1299977 api_server.go:166] Checking apiserver status ...
	I1218 01:18:01.640829 1299977 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1218 01:18:01.654165 1299977 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1256/cgroup
	I1218 01:18:01.662829 1299977 api_server.go:182] apiserver freezer: "10:freezer:/docker/3b02cab585d1e15d5eaaf8b7bb9bce990ce70bc93c00f1bcb71791db923a52aa/crio/crio-cbf76718f5fa96f905da89fe863c06fd43f2bc5cc3755e4f5ee6fcc4ef4e0865"
	I1218 01:18:01.662895 1299977 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/3b02cab585d1e15d5eaaf8b7bb9bce990ce70bc93c00f1bcb71791db923a52aa/crio/crio-cbf76718f5fa96f905da89fe863c06fd43f2bc5cc3755e4f5ee6fcc4ef4e0865/freezer.state
	I1218 01:18:01.670257 1299977 api_server.go:204] freezer state: "THAWED"
	I1218 01:18:01.670286 1299977 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1218 01:18:01.678992 1299977 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1218 01:18:01.679036 1299977 status.go:463] multinode-771808 apiserver status = Running (err=<nil>)
	I1218 01:18:01.679046 1299977 status.go:176] multinode-771808 status: &{Name:multinode-771808 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:18:01.679069 1299977 status.go:174] checking status of multinode-771808-m02 ...
	I1218 01:18:01.679511 1299977 cli_runner.go:164] Run: docker container inspect multinode-771808-m02 --format={{.State.Status}}
	I1218 01:18:01.699876 1299977 status.go:371] multinode-771808-m02 host status = "Running" (err=<nil>)
	I1218 01:18:01.699901 1299977 host.go:66] Checking if "multinode-771808-m02" exists ...
	I1218 01:18:01.700216 1299977 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-771808-m02
	I1218 01:18:01.722790 1299977 host.go:66] Checking if "multinode-771808-m02" exists ...
	I1218 01:18:01.723123 1299977 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1218 01:18:01.723175 1299977 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-771808-m02
	I1218 01:18:01.744796 1299977 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34055 SSHKeyPath:/home/jenkins/minikube-integration/22186-1156339/.minikube/machines/multinode-771808-m02/id_rsa Username:docker}
	I1218 01:18:01.857498 1299977 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1218 01:18:01.872796 1299977 status.go:176] multinode-771808-m02 status: &{Name:multinode-771808-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:18:01.872836 1299977 status.go:174] checking status of multinode-771808-m03 ...
	I1218 01:18:01.873263 1299977 cli_runner.go:164] Run: docker container inspect multinode-771808-m03 --format={{.State.Status}}
	I1218 01:18:01.890979 1299977 status.go:371] multinode-771808-m03 host status = "Stopped" (err=<nil>)
	I1218 01:18:01.890999 1299977 status.go:384] host is not running, skipping remaining checks
	I1218 01:18:01.891006 1299977 status.go:176] multinode-771808-m03 status: &{Name:multinode-771808-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.46s)

TestMultiNode/serial/StartAfterStop (8.27s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 node start m03 -v=5 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-771808 node start m03 -v=5 --alsologtostderr: (7.456813094s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (8.27s)

TestMultiNode/serial/RestartKeepsNodes (78.26s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-771808
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-771808
E1218 01:18:19.390531 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-771808: (25.062494421s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-771808 --wait=true -v=5 --alsologtostderr
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-771808 --wait=true -v=5 --alsologtostderr: (53.070944907s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-771808
--- PASS: TestMultiNode/serial/RestartKeepsNodes (78.26s)

TestMultiNode/serial/DeleteNode (5.68s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 node delete m03
E1218 01:19:32.021390 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-771808 node delete m03: (4.988359443s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.68s)

TestMultiNode/serial/StopMultiNode (24s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-771808 stop: (23.809875661s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-771808 status: exit status 7 (93.411ms)

-- stdout --
	multinode-771808
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-771808-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr: exit status 7 (93.856444ms)

-- stdout --
	multinode-771808
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-771808-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1218 01:19:58.062453 1307789 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:19:58.062638 1307789 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:19:58.062663 1307789 out.go:374] Setting ErrFile to fd 2...
	I1218 01:19:58.062683 1307789 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:19:58.062953 1307789 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:19:58.063176 1307789 out.go:368] Setting JSON to false
	I1218 01:19:58.063237 1307789 mustload.go:66] Loading cluster: multinode-771808
	I1218 01:19:58.063313 1307789 notify.go:221] Checking for updates...
	I1218 01:19:58.064684 1307789 config.go:182] Loaded profile config "multinode-771808": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:19:58.064742 1307789 status.go:174] checking status of multinode-771808 ...
	I1218 01:19:58.065434 1307789 cli_runner.go:164] Run: docker container inspect multinode-771808 --format={{.State.Status}}
	I1218 01:19:58.085029 1307789 status.go:371] multinode-771808 host status = "Stopped" (err=<nil>)
	I1218 01:19:58.085050 1307789 status.go:384] host is not running, skipping remaining checks
	I1218 01:19:58.085057 1307789 status.go:176] multinode-771808 status: &{Name:multinode-771808 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1218 01:19:58.085089 1307789 status.go:174] checking status of multinode-771808-m02 ...
	I1218 01:19:58.085409 1307789 cli_runner.go:164] Run: docker container inspect multinode-771808-m02 --format={{.State.Status}}
	I1218 01:19:58.104573 1307789 status.go:371] multinode-771808-m02 host status = "Stopped" (err=<nil>)
	I1218 01:19:58.104599 1307789 status.go:384] host is not running, skipping remaining checks
	I1218 01:19:58.104606 1307789 status.go:176] multinode-771808-m02 status: &{Name:multinode-771808-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.00s)

TestMultiNode/serial/RestartMultiNode (47.41s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-771808 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=crio
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-771808 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=crio: (46.628243777s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-771808 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (47.41s)

TestMultiNode/serial/ValidateNameConflict (35.48s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-771808
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-771808-m02 --driver=docker  --container-runtime=crio
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-771808-m02 --driver=docker  --container-runtime=crio: exit status 14 (90.214693ms)

-- stdout --
	* [multinode-771808-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-771808-m02' is duplicated with machine name 'multinode-771808-m02' in profile 'multinode-771808'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-771808-m03 --driver=docker  --container-runtime=crio
E1218 01:21:17.039316 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-771808-m03 --driver=docker  --container-runtime=crio: (32.900337337s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-771808
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-771808: exit status 80 (357.855614ms)

-- stdout --
	* Adding node m03 to cluster multinode-771808 as [worker]
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-771808-m03 already exists in multinode-771808-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-771808-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-771808-m03: (2.078733605s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (35.48s)

                                                
                                    
TestPreload (118.4s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:41: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-639199 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=crio
preload_test.go:41: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-639199 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=crio: (1m0.965589346s)
preload_test.go:49: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-639199 image pull gcr.io/k8s-minikube/busybox
preload_test.go:49: (dbg) Done: out/minikube-linux-arm64 -p test-preload-639199 image pull gcr.io/k8s-minikube/busybox: (2.003691413s)
preload_test.go:55: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-639199
preload_test.go:55: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-639199: (5.914948416s)
preload_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-639199 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=crio
E1218 01:23:02.465045 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:23:19.390314 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-639199 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=crio: (46.759896242s)
preload_test.go:68: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-639199 image list
helpers_test.go:176: Cleaning up "test-preload-639199" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-639199
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-639199: (2.503093394s)
--- PASS: TestPreload (118.40s)

                                                
                                    
TestScheduledStopUnix (110.52s)

                                                
                                                
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-405443 --memory=3072 --driver=docker  --container-runtime=crio
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-405443 --memory=3072 --driver=docker  --container-runtime=crio: (33.897264983s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-405443 --schedule 5m -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1218 01:23:57.706470 1321850 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:23:57.706579 1321850 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:23:57.706584 1321850 out.go:374] Setting ErrFile to fd 2...
	I1218 01:23:57.706588 1321850 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:23:57.706842 1321850 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:23:57.707074 1321850 out.go:368] Setting JSON to false
	I1218 01:23:57.707175 1321850 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:23:57.707510 1321850 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:23:57.707576 1321850 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/config.json ...
	I1218 01:23:57.707736 1321850 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:23:57.707851 1321850 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3

                                                
                                                
** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-405443 -n scheduled-stop-405443
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-405443 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1218 01:23:58.139276 1321940 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:23:58.139433 1321940 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:23:58.139445 1321940 out.go:374] Setting ErrFile to fd 2...
	I1218 01:23:58.139451 1321940 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:23:58.139790 1321940 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:23:58.140070 1321940 out.go:368] Setting JSON to false
	I1218 01:23:58.140266 1321940 daemonize_unix.go:73] killing process 1321869 as it is an old scheduled stop
	I1218 01:23:58.140346 1321940 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:23:58.144641 1321940 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:23:58.144737 1321940 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/config.json ...
	I1218 01:23:58.145007 1321940 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:23:58.145137 1321940 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3

                                                
                                                
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1218 01:23:58.150567 1159552 retry.go:31] will retry after 94.692µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.151718 1159552 retry.go:31] will retry after 97.144µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.152823 1159552 retry.go:31] will retry after 178.507µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.153937 1159552 retry.go:31] will retry after 447.28µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.154993 1159552 retry.go:31] will retry after 528.756µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.156100 1159552 retry.go:31] will retry after 787.277µs: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.157159 1159552 retry.go:31] will retry after 1.129101ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.159319 1159552 retry.go:31] will retry after 1.964921ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.161510 1159552 retry.go:31] will retry after 1.318206ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.163710 1159552 retry.go:31] will retry after 2.869425ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.166757 1159552 retry.go:31] will retry after 3.309613ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.170998 1159552 retry.go:31] will retry after 4.663737ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.176236 1159552 retry.go:31] will retry after 15.710759ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.192476 1159552 retry.go:31] will retry after 25.279473ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.219151 1159552 retry.go:31] will retry after 31.377191ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
I1218 01:23:58.251403 1159552 retry.go:31] will retry after 60.989921ms: open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-405443 --cancel-scheduled
minikube stop output:

                                                
                                                
-- stdout --
	* All existing scheduled stops cancelled

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-405443 -n scheduled-stop-405443
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-405443
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-405443 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1218 01:24:24.096625 1322412 out.go:360] Setting OutFile to fd 1 ...
	I1218 01:24:24.096834 1322412 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:24:24.096864 1322412 out.go:374] Setting ErrFile to fd 2...
	I1218 01:24:24.096885 1322412 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1218 01:24:24.097168 1322412 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22186-1156339/.minikube/bin
	I1218 01:24:24.097455 1322412 out.go:368] Setting JSON to false
	I1218 01:24:24.097585 1322412 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:24:24.097973 1322412 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3
	I1218 01:24:24.098081 1322412 profile.go:143] Saving config to /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/scheduled-stop-405443/config.json ...
	I1218 01:24:24.098293 1322412 mustload.go:66] Loading cluster: scheduled-stop-405443
	I1218 01:24:24.098453 1322412 config.go:182] Loaded profile config "scheduled-stop-405443": Driver=docker, ContainerRuntime=crio, KubernetesVersion=v1.34.3

                                                
                                                
** /stderr **
E1218 01:24:32.021165 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:172: signal error was:  os: process already finished
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-405443
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-405443: exit status 7 (72.789684ms)

                                                
                                                
-- stdout --
	scheduled-stop-405443
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-405443 -n scheduled-stop-405443
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-405443 -n scheduled-stop-405443: exit status 7 (68.820222ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:176: Cleaning up "scheduled-stop-405443" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-405443
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-405443: (5.029028551s)
--- PASS: TestScheduledStopUnix (110.52s)
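The `retry.go` lines in this section show minikube polling for the scheduled-stop pid file with roughly doubling, jittered intervals. The same poll-with-backoff shape can be sketched in Go; this is a minimal illustration, not minikube's actual `retry.go`, and the pid-file path and backoff constants here are made up for the example:

```go
package main

import (
	"fmt"
	"math/rand"
	"os"
	"time"
)

// waitForFile polls for path, roughly doubling the delay (with jitter)
// between attempts, mirroring the "will retry after ..." lines in the
// log above. The starting delay and attempt limit are illustrative.
func waitForFile(path string, maxAttempts int) error {
	delay := 100 * time.Microsecond
	for i := 0; i < maxAttempts; i++ {
		if _, err := os.Stat(path); err == nil {
			return nil // file showed up
		}
		// jittered backoff: sleep somewhere in [delay, 2*delay)
		wait := delay + time.Duration(rand.Int63n(int64(delay)))
		fmt.Printf("will retry after %v\n", wait)
		time.Sleep(wait)
		delay *= 2
	}
	return fmt.Errorf("open %s: no such file or directory", path)
}

func main() {
	// A path that does not exist, so every attempt retries and we
	// eventually see the final error, as in the log above.
	if err := waitForFile("/tmp/definitely-missing-pid-file", 5); err != nil {
		fmt.Println(err)
	}
}
```

The jitter keeps many concurrent pollers from waking in lockstep, which is why the intervals in the log grow unevenly rather than as clean powers of two.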

                                                
                                    
TestInsufficientStorage (13.22s)

                                                
                                                
=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-391938 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=crio
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-391938 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=crio: exit status 26 (10.612913173s)

                                                
                                                
-- stdout --
	{"specversion":"1.0","id":"e854a9c1-d3ea-4c92-b1bd-73a08aa1024a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-391938] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"82ced235-2504-4bbb-93f3-9860629a1310","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22186"}}
	{"specversion":"1.0","id":"9bf92187-b3e5-49b3-9fe4-e9c28af2a450","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"029dc879-51a6-4f38-8374-84db65f837a6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig"}}
	{"specversion":"1.0","id":"1cacb959-f624-4fa6-b770-a958aed43748","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube"}}
	{"specversion":"1.0","id":"b5cb5a44-33bd-4190-81db-471b144d655b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"94aca085-5c1f-4616-9776-6bbf0072c0ce","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"b7cfcc69-8b6d-4eae-8339-3953c79151ba","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"0ba6e94c-1bf8-4deb-9f08-33e937cf6393","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"d9f9f769-38aa-4be6-88b8-8f1552ccc670","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"a7d01ff4-089b-49e6-b863-68b019046af6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"7b2f0760-2f9a-4725-bb39-38b4d1ace33e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-391938\" primary control-plane node in \"insufficient-storage-391938\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"d33a7c72-5d4c-4837-b289-553efde2b092","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1765966054-22186 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"99b7e5e8-4197-4447-9b2c-3579c5484e6a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"31c71ca5-0787-4a2e-b8af-2a0fabbfcfbb","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}

                                                
                                                
-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-391938 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-391938 --output=json --layout=cluster: exit status 7 (315.918411ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-391938","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-391938","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 01:25:25.187039 1324277 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-391938" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig

                                                
                                                
** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-391938 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-391938 --output=json --layout=cluster: exit status 7 (311.355373ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-391938","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-391938","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1218 01:25:25.504025 1324344 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-391938" does not appear in /home/jenkins/minikube-integration/22186-1156339/kubeconfig
	E1218 01:25:25.513875 1324344 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/insufficient-storage-391938/events.json: no such file or directory

                                                
                                                
** /stderr **
helpers_test.go:176: Cleaning up "insufficient-storage-391938" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-391938
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-391938: (1.975259517s)
--- PASS: TestInsufficientStorage (13.22s)

                                                
                                    
TestRunningBinaryUpgrade (295.98s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.35.0.831864282 start -p running-upgrade-850997 --memory=3072 --vm-driver=docker  --container-runtime=crio
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.35.0.831864282 start -p running-upgrade-850997 --memory=3072 --vm-driver=docker  --container-runtime=crio: (30.032599606s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-850997 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
E1218 01:33:19.390580 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:34:20.104126 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:34:32.021028 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:36:17.039882 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-850997 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: (4m22.766670128s)
helpers_test.go:176: Cleaning up "running-upgrade-850997" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-850997
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-850997: (1.96030856s)
--- PASS: TestRunningBinaryUpgrade (295.98s)

                                                
                                    
TestMissingContainerUpgrade (114.44s)

                                                
                                                
=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.35.0.2880058732 start -p missing-upgrade-381437 --memory=3072 --driver=docker  --container-runtime=crio
E1218 01:25:55.106560 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.35.0.2880058732 start -p missing-upgrade-381437 --memory=3072 --driver=docker  --container-runtime=crio: (1m3.116998511s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-381437
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-381437
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-381437 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-381437 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: (46.659515138s)
helpers_test.go:176: Cleaning up "missing-upgrade-381437" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-381437
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-381437: (2.359198861s)
--- PASS: TestMissingContainerUpgrade (114.44s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=crio
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=crio: exit status 14 (94.484811ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-573547] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22186
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22186-1156339/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22186-1156339/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (45.83s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-573547 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-573547 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio: (45.222312866s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-573547 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (45.83s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (8.38s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio
E1218 01:26:17.040155 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio: (5.557656689s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-573547 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-573547 status -o json: exit status 2 (380.752256ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-573547","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}
-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-573547
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-573547: (2.437144364s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (8.38s)
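The status JSON captured above is the contract this test checks: with Kubernetes disabled, the host stays `Running` while `Kubelet` and `APIServer` report `Stopped`. A minimal sketch of reading those fields (the parsing code is illustrative, not part of the test suite; the JSON literal is copied from the stdout above):

```python
import json

# JSON copied verbatim from the `minikube status -o json` stdout above.
raw = ('{"Name":"NoKubernetes-573547","Host":"Running","Kubelet":"Stopped",'
       '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}')

status = json.loads(raw)

# With --no-kubernetes the node keeps running while the Kubernetes
# components are stopped; these are the fields the test asserts on.
assert status["Host"] == "Running"
assert status["Kubelet"] == "Stopped"
assert status["APIServer"] == "Stopped"
print(f'{status["Name"]}: host={status["Host"]}, kubelet={status["Kubelet"]}')
```

Note that in this state `minikube status` itself exits non-zero (exit status 2, as the "Non-zero exit" line shows), so a caller must capture stdout despite the non-zero exit.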

TestNoKubernetes/serial/Start (9.15s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-573547 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=crio: (9.148429532s)
--- PASS: TestNoKubernetes/serial/Start (9.15s)

TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)

=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/22186-1156339/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.42s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-573547 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-573547 "sudo systemctl is-active --quiet service kubelet": exit status 1 (416.958015ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.42s)

TestNoKubernetes/serial/ProfileList (3.19s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
no_kubernetes_test.go:204: (dbg) Done: out/minikube-linux-arm64 profile list --output=json: (2.409656018s)
--- PASS: TestNoKubernetes/serial/ProfileList (3.19s)

TestNoKubernetes/serial/Stop (1.29s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-573547
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-573547: (1.291373171s)
--- PASS: TestNoKubernetes/serial/Stop (1.29s)

TestNoKubernetes/serial/StartNoArgs (7.07s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-573547 --driver=docker  --container-runtime=crio
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-573547 --driver=docker  --container-runtime=crio: (7.068190809s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (7.07s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-573547 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-573547 "sudo systemctl is-active --quiet service kubelet": exit status 1 (279.311528ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

TestStoppedBinaryUpgrade/Setup (1.04s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (1.04s)

TestStoppedBinaryUpgrade/Upgrade (303.32s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.35.0.367099631 start -p stopped-upgrade-156815 --memory=3072 --vm-driver=docker  --container-runtime=crio
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.35.0.367099631 start -p stopped-upgrade-156815 --memory=3072 --vm-driver=docker  --container-runtime=crio: (33.9087597s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.35.0.367099631 -p stopped-upgrade-156815 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.35.0.367099631 -p stopped-upgrade-156815 stop: (1.252183954s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-156815 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
E1218 01:28:19.390180 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:29:32.021448 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/addons-399099/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1218 01:31:17.040030 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-288604/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-156815 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: (4m28.157875227s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (303.32s)

TestStoppedBinaryUpgrade/MinikubeLogs (1.71s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-156815
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-156815: (1.713624608s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (1.71s)

TestPause/serial/Start (54.09s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-022448 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=crio
E1218 01:38:19.390659 1159552 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22186-1156339/.minikube/profiles/functional-240845/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-022448 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=crio: (54.089201612s)
--- PASS: TestPause/serial/Start (54.09s)

TestPause/serial/SecondStartNoReconfiguration (28.05s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-022448 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-022448 --alsologtostderr -v=1 --driver=docker  --container-runtime=crio: (28.024818364s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (28.05s)
Test skip (36/316)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.3/cached-images 0
15 TestDownloadOnly/v1.34.3/binaries 0
16 TestDownloadOnly/v1.34.3/kubectl 0
23 TestDownloadOnly/v1.35.0-rc.1/cached-images 0
24 TestDownloadOnly/v1.35.0-rc.1/binaries 0
25 TestDownloadOnly/v1.35.0-rc.1/kubectl 0
29 TestDownloadOnlyKic 0.43
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
63 TestDockerEnvContainerd 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
130 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
132 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
207 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv 0
212 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv 0
224 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig 0
225 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
226 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS 0
261 TestGvisorAddon 0
283 TestImageBuild 0
284 TestISOImage 0
348 TestChangeNoneUser 0
351 TestScheduledStopWindows 0
353 TestSkaffold 0

TestDownloadOnly/v1.28.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

TestDownloadOnly/v1.28.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

TestDownloadOnly/v1.28.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

TestDownloadOnly/v1.34.3/cached-images (0s)

=== RUN   TestDownloadOnly/v1.34.3/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.3/cached-images (0.00s)

TestDownloadOnly/v1.34.3/binaries (0s)

=== RUN   TestDownloadOnly/v1.34.3/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.3/binaries (0.00s)

TestDownloadOnly/v1.34.3/kubectl (0s)

=== RUN   TestDownloadOnly/v1.34.3/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.3/kubectl (0.00s)

TestDownloadOnly/v1.35.0-rc.1/cached-images (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/cached-images (0.00s)

TestDownloadOnly/v1.35.0-rc.1/binaries (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/binaries (0.00s)

TestDownloadOnly/v1.35.0-rc.1/kubectl (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/kubectl (0.00s)

TestDownloadOnlyKic (0.43s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-540812 --alsologtostderr --driver=docker  --container-runtime=crio
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:176: Cleaning up "download-docker-540812" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-540812
--- SKIP: TestDownloadOnlyKic (0.43s)

TestOffline (0s)

=== RUN   TestOffline
=== PAUSE TestOffline
=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0s)

=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:761: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:485: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)

=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin
=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1035: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)

=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing crio
--- SKIP: TestDockerFlags (0.00s)

TestDockerEnvContainerd (0s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with crio true linux arm64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)

=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)

=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)

=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing crio
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing crio
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing crio
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing crio
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestISOImage (0s)

=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

TestChangeNoneUser (0s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing crio container runtime
--- SKIP: TestSkaffold (0.00s)
